A source's safety can depend on a single photograph. Here's how investigative journalists, human rights researchers, and activists anonymize images — and what tools they use.
When an investigative journalist publishes a story, the images they use can be as dangerous as the words. A visible face in a crowd photograph, a distinctive tattoo at a protest march, a unique vehicle parked outside a safe house, a name tag on a hospital employee — any of these details can identify sources, witnesses, or activists to hostile actors.
The stakes are not theoretical. Human rights organizations and press freedom groups document cases every year where sources were identified from published images, with consequences ranging from dismissal and harassment to imprisonment and violence. The obligation to protect sources does not end at the written word — it extends to every pixel in every image that accompanies a story.
At the same time, audiences deserve evidence. Visual documentation is often the most compelling and credible element of an investigation. The challenge journalists face is publishing powerful images while ensuring that any detail which could identify a vulnerable individual is removed before publication. This requires a deliberate, systematic workflow — not an afterthought.
Most journalists instinctively think about faces when they consider anonymization. But faces are just one of many identifying features that a rigorous pre-publication review needs to cover:
- faces, including partial, reflected, and background faces
- tattoos and distinctive clothing
- license plates and distinctive vehicles
- location markers such as street signs, storefronts, and landmarks
- visible documents, name tags, and device screens
- reflections and background details that reveal any of the above
[Before/after comparison — before: identity visible; after: identity protected]
Every digital photograph carries embedded metadata — information stored inside the image file itself, invisible to the viewer but readable by anyone with the right software. EXIF (Exchangeable Image File Format) data embedded in smartphone photographs typically includes:
- GPS coordinates of where the photo was taken
- the date and time of capture
- the make and model of the device
- camera settings and, in some cases, the software used to edit the image
A source who photographs a document in their office and sends it to a journalist may inadvertently embed their precise office location, the exact time they took the photograph, and the make and model of their personal phone. This combination of metadata can be sufficient to identify and locate the source even if no visual identifying features are present in the image content itself.
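To make the risk concrete, here is a minimal sketch of how easily that embedded metadata can be read back out. It uses Pillow, a common third-party Python imaging library (an assumption of this example, not a tool named in the workflow above); the sample tags and values are invented for illustration, but the tag IDs are standard EXIF.

```python
# Sketch: inspecting EXIF metadata with Pillow (third-party: pip install Pillow).
# We build a small in-memory JPEG carrying sample tags to show what a phone
# photo typically exposes; a real file on disk is opened the same way.
import io
from PIL import Image
from PIL.ExifTags import TAGS

# Simulate a photo with device metadata (tag IDs are standard EXIF fields;
# GPS coordinates live in a separate GPSInfo sub-IFD, omitted here).
exif = Image.Exif()
exif[0x010F] = "ExampleCorp"          # Make (hypothetical value)
exif[0x0110] = "Phone X"              # Model (hypothetical value)
exif[0x0132] = "2024:01:01 12:00:00"  # DateTime

buf = io.BytesIO()
Image.new("RGB", (8, 8)).save(buf, format="JPEG", exif=exif)

# What a recipient -- or an adversary -- can read back from the file:
photo = Image.open(io.BytesIO(buf.getvalue()))
tags = {TAGS.get(tag_id, tag_id): value for tag_id, value in photo.getexif().items()}
print(tags)  # device make, model, and capture time survive transmission intact
```

Anyone who receives the file, including anyone who downloads it after publication, can run the equivalent of those last three lines.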
The John McAfee case is one of the more widely publicized examples: journalists published a photograph of McAfee while he was in hiding, and the EXIF data embedded in the image included GPS coordinates that revealed his location in Guatemala. This is not an isolated incident.
Blurify's exports do not carry EXIF data from the original file. When you export a blurred image from Blurify, the output is a clean PNG, JPEG, or WebP file with no location data, no timestamps, and no device information inherited from the original. For complete protection, sources should also strip metadata on their own device before transmitting images, using a tool like ExifTool, MAT2, or the built-in metadata stripping in the Signal messaging app.
Facial recognition technology has become dramatically more accessible and capable over the past decade. Tools that once required nation-state intelligence budgets are now available commercially or freely on the open web. Services like PimEyes allow anyone to upload a photograph and search for matching faces across billions of indexed web images. Commercial law enforcement databases cover hundreds of millions of faces.
This changes the calculus for journalists significantly. A partially obscured face in a published photograph that would have been considered adequately anonymized a decade ago may now be matchable with high confidence. Even side-profile views and shots from behind — historically considered safe — are no longer reliably anonymous: ear biometrics systems can match profile views, and gait recognition systems deployed in some jurisdictions can identify people in footage taken from behind.
Best practices for face anonymization have evolved accordingly:
- Cover the full head (hair, ears, and jawline), not just the central facial features.
- Use a heavy blur (radius 20px or more) rather than a light blur or pixelation, which modern systems can often defeat.
- For high-risk subjects, prefer solid redaction over blur, since it leaves no pixel data to analyze.
- Remember that faces are not the only biometric: ears, tattoos, and other distinctive features can also be matched.
Before editing, examine the full image carefully — not just the main subject. Work from foreground to background. List every face, tattoo, license plate, location marker, visible document, and device screen you can find. Pay attention to reflections and background details. Zoom in on parts of the image you might otherwise overlook.
For group photos or crowd images, this review can take significant time. Build it into your pre-publication timeline. A missed identifying feature discovered after publication — or, worse, by a hostile actor — cannot be undone.
Before any editing, strip EXIF metadata from the original file. Use ExifTool (command-line), MAT2 (Metadata Anonymisation Toolkit, available on Linux), or a browser-based EXIF stripper. This step should ideally happen on the source's device before the image is transmitted, but at minimum it should happen before the image enters your publishing workflow.
Signal, the encrypted messaging application, automatically strips metadata from photos when they are sent — making it a good channel for receiving sensitive images from sources. Other messaging platforms do not reliably strip metadata.
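ExifTool and MAT2 are the standard tools for this step. As a fallback illustration of what "stripping" means, the following Pillow sketch (a third-party library; this approach is my illustration, not one the tools above use verbatim) copies only the pixel data into a fresh image, leaving the metadata block behind:

```python
# Sketch: stripping metadata by re-encoding pixels only. Prefer dedicated
# tools (e.g. ExifTool's "-all=" option, or MAT2) in a real workflow; this
# shows the principle. Requires Pillow (pip install Pillow).
import io
from PIL import Image

def strip_metadata(jpeg_bytes: bytes) -> bytes:
    src = Image.open(io.BytesIO(jpeg_bytes))
    clean = Image.new(src.mode, src.size)
    clean.putdata(list(src.getdata()))  # pixels only, no EXIF carried over
    out = io.BytesIO()
    clean.save(out, format="JPEG")
    return out.getvalue()

# Demo: a tagged image goes in, a metadata-free image comes out.
exif = Image.Exif()
exif[0x010F] = "ExampleCorp"  # Make (hypothetical value)
buf = io.BytesIO()
Image.new("RGB", (8, 8)).save(buf, format="JPEG", exif=exif)

cleaned = strip_metadata(buf.getvalue())
remaining = Image.open(io.BytesIO(cleaned)).getexif()
print(dict(remaining))  # {}
```

Note that re-encoding a JPEG this way also recompresses the image; dedicated tools like ExifTool can remove metadata without touching the image data.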
Open the image in Blurify. For maximum protection in high-risk journalism contexts, use Redact mode (solid black fill). For lower-risk contexts where visual continuity matters more, a heavy Gaussian blur (radius 20–30px) provides good protection.
Cover every identifying feature from your audit in step 1. Use the freehand tool for irregular shapes — tracing around a tattoo, an unusual piece of clothing, or an awkwardly positioned face is more precise and complete than a rectangle. Draw shapes slightly larger than the feature itself to ensure full coverage at the edges.
For images with many faces, use the Detect Faces button to automatically generate blur shapes for all detected faces, then manually review the result to add shapes for any faces the detector missed and confirm no shapes were incorrectly placed.
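Blurify's internals are not described here, but the blur-shape step above can be sketched with Pillow (a third-party library; the function name and padding value are this example's assumptions). A rectangular region stands in for a drawn shape, and the box is padded slightly beyond the feature, mirroring the advice to oversize shapes for full edge coverage:

```python
# Sketch: blurring one rectangular region of an image, analogous to drawing
# a blur shape over a face. Requires Pillow (pip install Pillow).
from PIL import Image, ImageDraw, ImageFilter

def blur_region(img: Image.Image, box: tuple, radius: int = 25, pad: int = 8) -> Image.Image:
    l, t, r, b = box
    # Oversize the box slightly, clamped to the image bounds.
    box = (max(l - pad, 0), max(t - pad, 0),
           min(r + pad, img.width), min(b + pad, img.height))
    region = img.crop(box).filter(ImageFilter.GaussianBlur(radius=radius))
    out = img.copy()
    out.paste(region, box)
    return out

# Demo: a bright square on a dark background, standing in for a face.
img = Image.new("L", (100, 100), 0)
ImageDraw.Draw(img).rectangle((40, 40, 60, 60), fill=255)
blurred = blur_region(img, (40, 40, 60, 60), radius=20)
```

After the call, the center of the square is no longer pure white: the heavy blur has mixed it with the surrounding background, which is exactly the degradation a radius-20+ blur applies to facial detail.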
Export the redacted image and open the exported file in a fresh browser tab or image viewer. Zoom in to every redacted region and confirm that the original content is fully covered with no visible edges or partial reveals. Pay particular attention to the margins of blur shapes where the effect tapers — a blur that softens at the edges may leave facial features or text partially visible.
For critical publications, have a colleague independently review the exported image with fresh eyes. A second person often catches features that the person who did the original editing has become visually accustomed to.
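For solid-black redactions specifically, a quick automated check can supplement (not replace) this visual review. The sketch below, again using Pillow with a function name of my choosing, verifies that every pixel inside each intended redaction box is pure black in the exported file; blurred regions still need human eyes.

```python
# Sketch: verifying that redaction boxes in an exported image are fully
# covered with solid black. Requires Pillow (pip install Pillow).
from PIL import Image, ImageDraw

def is_fully_redacted(img: Image.Image, boxes: list) -> bool:
    rgb = img.convert("RGB")
    for box in boxes:
        region = rgb.crop(box)
        # getextrema returns (min, max) per channel; solid black is (0, 0).
        if any(hi != 0 for _, hi in region.getextrema()):
            return False
    return True

# Demo: one correctly covered box, and one box that extends past the fill.
img = Image.new("RGB", (50, 50), (200, 200, 200))
ImageDraw.Draw(img).rectangle((10, 10, 30, 30), fill=(0, 0, 0))
print(is_fully_redacted(img, [(10, 10, 31, 31)]))  # True
print(is_fully_redacted(img, [(5, 5, 31, 31)]))    # False
```

A failing check means some pixels inside the box are not black — exactly the kind of partial reveal at shape margins that the manual zoom-in is meant to catch.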
Only the exported, redacted file should enter your publishing workflow, CMS, or email chain. The original file should be stored in a secure, access-controlled location — if it needs to be preserved as evidentiary material — or securely deleted using a tool like BleachBit. Never share the original file through the same channels as the publication.
Organizations like the Electronic Frontier Foundation (EFF), Access Now, and the Freedom of the Press Foundation recommend a layered approach to image security in journalism: secure channels for receiving images, metadata stripping before editing, visual redaction of identifying features, independent verification of the result, and secure storage or deletion of originals.
Anonymizing images is not just a technical precaution — it is an ethical responsibility. A journalist who publishes an identifiable image of a source after implicitly or explicitly promising confidentiality has broken a fundamental commitment, regardless of their intentions. The commitment to source protection is not conditional on technical difficulty. If the workflow to do it correctly requires an extra ten minutes, that time is owed to the source.
This principle extends beyond traditional journalism. Human rights documentarians, academic researchers working with vulnerable populations, social workers, healthcare professionals, and NGO field workers all operate under analogous obligations when they capture or handle images of people in sensitive contexts.
The practical tools to fulfill this obligation are freely available and, in the case of browser-based tools like Blurify, require no technical expertise. The question is not whether anonymization is difficult — it isn't — but whether it is built into the workflow as a standard step rather than an afterthought.
How strong does a blur need to be to defeat facial recognition?
A strong Gaussian blur (radius 20px or more, covering the full head including hair and ears) significantly degrades the quality of the facial data and substantially reduces recognition accuracy for most commercial systems. For the highest-risk cases — subjects facing hostile government surveillance or sophisticated threat actors — a solid black redaction box is preferable, as it leaves no pixel data to analyze.
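The difference can be made quantitative with a toy measurement (a Pillow sketch under this example's own assumptions, not a recognition benchmark): a heavy blur leaves reduced but nonzero pixel variation, while a solid fill leaves none at all.

```python
# Sketch: comparing residual pixel variation after blur vs. redaction.
# Requires Pillow (pip install Pillow).
from PIL import Image, ImageDraw, ImageFilter

def spread(img: Image.Image) -> int:
    """Difference between brightest and darkest pixel (0 = no information)."""
    lo, hi = img.convert("L").getextrema()
    return hi - lo

# A bright ellipse on a dark background stands in for a face.
face = Image.new("L", (60, 60), 30)
ImageDraw.Draw(face).ellipse((15, 15, 45, 45), fill=220)

blurred = face.filter(ImageFilter.GaussianBlur(radius=20))
redacted = Image.new("L", face.size, 0)  # solid black fill

print(spread(face), spread(blurred), spread(redacted))
```

The blurred version retains some structure (degraded, but potentially analyzable), while the redacted version is flat black: there is literally nothing left to measure.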
Does Blurify remove EXIF metadata?
Yes. Blurify's exported images do not carry EXIF metadata from the original file. The output is a clean, flat image. However, you should still strip metadata from the original source file before it enters your editing workflow, using a dedicated EXIF removal tool.
Can Blurify anonymize video as well as images?
Yes. Blurify supports video files (MP4, WebM, MOV) in addition to images. You can draw blur or redaction shapes over faces and other identifying elements in video footage, and use the keyframe system to animate blur regions to track moving subjects. Video processing runs entirely in the browser using ffmpeg.wasm.
Voice identification is a separate and equally important concern for broadcast journalists. Pitch-shifting and voice modulation tools are commonly used to disguise interviewee voices. This is outside the scope of image anonymization but should be part of the same source-protection workflow for any multimedia content.
100% free · no sign-up
Blur faces, redact documents, and censor screenshots — all in your browser. Nothing is ever uploaded to our servers.
Open Blurify — it's free →