Microblog

When people ask me for advice on buying an SD card, I usually say something counterintuitive: don’t just grab the biggest one you can find.

Here’s why:

  • Size sweet spot – Bigger cards feel convenient, but in practice I rarely need more than 64GB per travel day (and that’s shooting RAW on a modern sensor). A 64GB or 128GB card is usually the best balance, depending on your style of photography.
  • Carry more cards – They weigh almost nothing, and having backups protects you from loss or failure. Losing a smaller, cheaper card hurts a lot less than losing a giant one filled with a week’s worth of shots.
  • Longevity – Cards don’t last forever. Their lifespan is measured in read/write cycles. Spreading your usage across multiple cards makes your whole setup last longer.
  • Avoid microSD + adapter – They’re slower, more fragile, and worse with heat. Stick to full-size SD cards.
  • Speed matters – If your camera supports it, go with UHS-II. Faster write speeds mean less waiting and better burst performance.

Just wrapped up a big project and finally had some time for blog maintenance. I upgraded the codebase - it was as simple as running a few commands in my local DDEV environment:

  • composer update
  • drush updb
  • drush config:export

Then, on the production environment:

  • composer install
  • drush deploy

Who said Drupal upgrades are hard?

Footnote: This blog is still running on the latest Drupal 10 (as of writing), but the Drupal 11 upgrade is on the horizon. I expect it to be just as smooth.

Lately I’ve been working on a large migration project where thousands of files had to be moved into Drupal as file entities. The source files were stored on Amazon S3, and with the help of the s3fs module and Drupal’s Migration API, it was straightforward to write a functioning migration.

However, there was one important quirk worth sharing—something you might run into if you’re doing a similar migration.

Rollbacks and File Deletion in Drupal

Drupal’s Migration API supports the full lifecycle of entities:

  • create – migrate new entities,
  • update – re-run migrations to refresh data,
  • delete (rollback) – remove entities that were created by a migration.

When rolling back file entities, Drupal doesn’t just delete the database record. It also unlinks the actual file from the filesystem.

For local file storage, this is a great feature: rolling back means your database and filesystem stay in sync.

But with S3, that behavior is problematic. Files on S3 often need to persist, since they might be referenced outside of Drupal by other systems or services. Accidentally unlinking them during a rollback is not acceptable.

First Attempt: Predelete Hook

My first idea was to prevent deletion at the entity level by throwing an exception from a hook_ENTITY_TYPE_predelete() implementation.

Unfortunately, that doesn’t work in this case. Drupal actually unlinks the file before the predelete hook is dispatched. By the time the hook is called, it’s already too late—the file has been removed.

The Real Fix: Custom Destination Plugin

The solution was to extend the EntityFile destination plugin that Drupal provides. By overriding the rollback step with a no-op, I could stop Drupal from unlinking anything: the rollback still completes from the migration's point of view, but the file entities and the underlying S3 objects are left in place.

Here’s the custom plugin class I ended up with:

<?php

namespace Drupal\mymodule\Plugin\migrate\destination;

use Drupal\file\Plugin\migrate\destination\EntityFile;
use Drupal\migrate\Attribute\MigrateDestination;

/**
 * Provides a migration destination plugin for File entities.
 */
#[MigrateDestination('mymodule:file')]
class EntityFileNoRollback extends EntityFile {

  /**
   * {@inheritdoc}
   *
   * Files using this destination plugin are never actually deleted, even
   * though the migration reports them as rolled back.
   */
  public function rollback(array $destination_identifier) {
    // Do nothing.
  }

}

With this in place, I just updated my migration YAML to use the new destination plugin:

destination:
  plugin: mymodule:file
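To verify the plugin behaves as intended, you can roll the migration back and confirm the objects are still on S3. A quick sketch, assuming a hypothetical migration ID of my_files and the drush commands provided by the migrate_tools module:

```shell
# my_files is a hypothetical migration ID; requires drush + migrate_tools.
drush migrate:import my_files     # creates the file entities
drush migrate:rollback my_files   # map rows are removed, but with the custom
                                  # destination the entities and S3 objects stay
```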

💡 Planning to work remotely from Egypt? Here are a few things to keep in mind as a digital nomad:

  1. 📶 Mobile coverage is inconsistent—even in Cairo, you may struggle to get a signal in some areas.
  2. 🔄 If your work requires a constant connection, have a backup (a local SIM, eSIM, or portable hotspot).
  3. 🏨 Paid hotel internet is often just a replacement for free Wi-Fi, not an upgrade—and it's usually pricey.
  4. 🚫 Some hotels may not offer Wi-Fi at all.
  5. 🏜️ The more rural the area, the worse the connectivity.
  6. 📱 A 4G mobile connection is often more stable than hotel Wi-Fi for video calls.
  7. 🏠 Even in private homes, expect random internet restarts.
  8. 📉 Average speeds hover around 10 Mbps in the best spots.
  9. 🎥 For Zoom/Google Meet calls, stick to 4G when possible for better stability.
  10. 📊 Broadband internet is usually prepaid and metered. In guesthouses or private homes with "free internet," expect occasional data outages—you’ll likely need to contact staff or your landlord to top it up.

Hello, 2025!

As the year came to a close, I made a final push to wrap up a few tasks that had been on my to-do list for a while. One of those was releasing the Drupal 10+ compatible version of the Feeds XLS utility module. This module integrates XLS format input documents into the Feeds ecosystem. If you’d like to learn more, you can check out the module’s page here: https://www.drupal.org/project/feeds_xls

I want to extend my gratitude to sdrycroft for his trust and quick responses when I reached out about becoming a co-maintainer. Moving forward, I’ll be maintaining the 2.x version of the module, while the Drupal 7-compatible 1.x version will remain available for those still using it.

This is a relatively small module, so I don’t expect a flood of large enterprise sites still on Drupal 7 to upgrade to Drupal 10 or 11 just for it. However, every bit of progress counts! If you’re maintaining a module that hasn’t yet been updated for Drupal 10/11, I encourage you to consider making a new release or opening up the project to additional maintainers who can help with the process.

Wishing everyone happy holidays—and for my friends in the southern hemisphere, enjoy your summer vacations!

How to Process Pictures in Bulk?

When working with large batches of photos, efficiency is key. My workflow is streamlined by taking time to crop photos directly on my camera (Fujifilm X-T5). This camera's ability to save images in both RAW and JPG formats, with a predefined color profile, significantly reduces the need for extensive post-processing.

While this approach saves me dozens—if not hundreds—of hours in color grading, some tasks still need attention before publishing. In the Drupal community, we follow specific guidelines for publishing photos to platforms like Flickr, ensuring proper attribution and copyright compliance.

Fortunately, tools like exiftool make this process straightforward. For example, here’s the command I used to prepare photos from DrupalCon Singapore 2024:

exiftool -artist="Jakub Piasecki" -copyright="© 2024 Jakub Piasecki" -sep ", " -keywords="DrupalCon, Drupal, Singapore, Open Source, Community, IT" /home/zaporylie/Pictures/DrupalCon\ Singapore\ 2024/Day\ 3/Contribution\ Day

This single command allowed me to:

  • Add keywords for better discoverability,
  • Embed artist and copyright information,
  • Process all photos in a specific folder in one go.

By investing a little effort upfront, you can save time and maintain professional standards for your published work.
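exiftool can also read tags back, which is handy for spot-checking that the metadata actually landed. For example, against the same folder as above:

```shell
# Print the tags written above for every photo in the folder:
exiftool -artist -copyright -keywords /home/zaporylie/Pictures/DrupalCon\ Singapore\ 2024/Day\ 3/Contribution\ Day
```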

Image
Dries Buytaert looking at the audience from the stage during his keynote.

Hello from Singapore.

As mentioned in the previous post, I didn't come here only as a delegate but also decided to volunteer during the conference. I helped in two areas: as a Drupal Splash Awards Judge and as a DrupalCon photographer.

All Splash Awards-related work was done way ahead of the event - coordinated by Julia Topliss, the Splash Awards gathered submissions from across the APAC region. I was impressed not only by the number but also by the quality of the submissions. It never ceases to amaze me how many applications there are in which Drupal proves itself not only useful but well ahead of its competition in terms of features, quality, and scalability.

Being the official conference photographer was demanding, I can't lie about that. Taking thousands of photos, selecting hundreds of unique shots, and processing and publishing them during the conference took a lot of effort. The stress of missing the shot is real, and I have a new level of appreciation for everyone who does this on a daily basis.

Below you'll find the group photo I took this year. For more photos, go to https://www.flickr.com/groups/drupalconsingapore2024/

Image
DrupalCon Singapore Group Photo

Last year I had the pleasure of participating in Web Summit in Lisbon. You can read more about my experience in this blog entry: https://piasecki.no/microblog/17

However, this year I decided to skip Web Summit, as it conflicted with my plans to visit Singapore for DrupalCon Singapore 2024! Happy to report that these plans worked out and I will see you in Singapore from Dec 9 to 11. More about my role in an upcoming post.

It's been a while since my last update but I intend to post more in the following weeks.

Today, a short bash snippet I use to sync files from my local machine to my NAS. Adding it here as a reference for myself, but feel free to adapt it to your own workflows.

rsync --remove-source-files --recursive --archive --compress --human-readable --partial --progress --info=progress2 ~/Pictures/ myNAS:Photos/

Let me go through the option flags I am using:

  • --remove-source-files – probably needs no explanation: files are deleted from the source once they have been transferred.
  • --recursive – rsync descends into directories inside ~/Pictures/, not just the files at the top level (technically redundant here, since --archive already implies it).
  • --archive – preserves ownership, permissions, symlinks, and timestamps.
  • --compress – compresses data during transfer (already-compressed file types may be skipped automatically).
  • --human-readable – prints file sizes in a human-readable format.
  • --partial – keeps partially transferred files if the transfer is interrupted, so it can resume later.
  • --progress – shows per-file progress, including the name of the file currently being transferred.
  • --info=progress2 – shows the overall percentage transferred and an ETA for the whole run.
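One flag worth adding on a first run is --dry-run, which previews the transfer without copying or deleting anything. A self-contained demonstration using throwaway local directories (no NAS needed):

```shell
# Create a throwaway source and destination.
mkdir -p /tmp/rsync-demo/src /tmp/rsync-demo/dst
echo "sample" > /tmp/rsync-demo/src/photo.jpg

# --dry-run only reports what WOULD be transferred; even with
# --remove-source-files, the source file is left untouched.
rsync --dry-run --remove-source-files --archive \
  /tmp/rsync-demo/src/ /tmp/rsync-demo/dst/

# photo.jpg is still in src/ and was never copied to dst/.
```

Once the dry-run output looks right, drop the flag and run the real transfer.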