Why we must nurture positive ethics in “citizen-driven” OSINT

As citizen-driven open source intelligence (OSINT) grows in popularity, so does the risk of its techniques being used by bad actors. I outline why digital investigators must promote a firmly ethical yet positive culture around our skillset.

Tom Jarvis
Jan 30, 2022

For the past few years, the community of open source investigators comprising journalists, human rights advocates, and digital sleuths has expanded significantly. Notable outlets such as Bellingcat have brought tales of groundbreaking OSINT and OSI (open source investigation) to mainstream headlines and features.

Case studies, such as the MH17 investigation or the coverage of mass detention in Xinjiang, have inspired a new wave of amateur OSINT investigators to take up arms against injustice by exploiting an online world of information chaos.

Here, I will refer to these people as citizen OSINT’ers — people who aren’t classically trained journalists or analysts, often using the tools without sufficient ethical foundations.

But with this growth in popularity comes a side that must be tamed. As with any rapidly growing sector, there are teething issues that must be addressed systemically and culturally. Within the sleuthing world, this rings especially true following the case of an individual wrongly accused in the aftermath of the Boston Marathon bombing.

Search volume for “OSINT” on Google has been steadily increasing since roughly 2015. Image: Google Trends

When I demonstrate to friends the OSINT techniques I use, I often compare them to a much darker skillset: cyber-stalking. Saying this certainly risks giving the field a bad name, but I do it because there is value in the comparison.

Showing people how easy it can be to obtain information on a target demonstrates just how vulnerable their data is. A simple screenshot of your Strava run or Uber fare posted online can reveal your common routes and home address. People don't always realise this, nor do they realise that, with a bit of extra legwork, less conspicuous clues can reveal personal data too.

While we can teach people to be more secure with their data — even from threats who can geolocate someone from their kitchen window view — we cannot stop OSINT tools from reaching the wrong people.

A simple outfit pose at your front door can easily reveal your address when combined with other information on your social media and a few hours on Google Maps Street View. Image: freepik — www.freepik.com

Two threat categories come immediately to mind when considering how citizen OSINT can fall foul of ethical practice: the petty bad actor and the inept do-gooder.

Regarding the former, a cyber-stalker or criminal can exploit open source data for harm. It could be an abusive family member gaining information on their victim's whereabouts, or a burglar analysing someone's daily routine to find a literal and figurative window to enter.

These exploitations are nothing new, but they gain additional potency as more documentation and digital toolkits are distributed in the public domain.

But, I argue, the solution is not to slam the door shut and revert OSI to a more closed community. Doing so would a) not work, because there will always be people ready to disseminate these skills, and b) sever the much-needed crowdsourcing potential of large-scale volunteer human rights intelligence projects.

Additionally, it could push malicious techniques into private, encrypted networks and make the "baddies" harder to defend against.

If you need to convince someone of the digital threats we all face, showing how easily seemingly impossible OSINT challenges can be solved is a great eye-opener.

Instead, what the community of digital investigators must nurture is a culture of sharing this skillset with everyone. Promote digital vigilance and showcase your skills to help peers see how revealing digital information can be.

Demonstrating how easily cyber-stalking can be conducted (while having legitimate work to demonstrate your benign intent) will help inform people of the threats arising from the digital age. Just do it in a way that promotes defensive strategies rather than creating a Stalker101 masterclass blog.

The second threat, and one which fills me with dread, is the inept do-gooder; refer back to the Reddit Boston Bombing fiasco if you need convincing. As a journalist, you have ethics drilled into you from day one. Equally important to accuracy is source protection.

Dramatised scene from the US TV series The Newsroom depicting the events of the Reddit Boston Bombing fiasco.

I remember that one of the first lessons in my journalism degree emphasised that a journalist should be prepared to go to jail in order to protect a source. This is not only for the source's safety but to preserve the assurance that anonymity can be guaranteed in return for information. Without this foundation, there can be no free press, nor any trusted exchange of information.

While inaccurate conclusions from bad research can be retracted or clarified, a source’s anonymity or privacy cannot be reinstated once compromised.

In my opinion, it is the duty of every researcher to consider ethics before publishing information. Likewise, ethics must be built into the methodology of any collaborative project, especially one with a mixed-skill contributor base.

When I founded the Tibet Research Project, a collaborative crowd-sourced OSINT project investigating the detention system in Tibet, my biggest fear was that our use of information would indirectly cause harm to individuals under oppression.

When planning the project's rules around what we could safely handle, I decided that using human sources carried significant vulnerability, not just from counter-investigations but also from possible complacency by anyone who contributed. This meant very strict restrictions on how we used user-generated digital information, to prevent unwitting sources from breaching China's national security laws.
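To make the idea of building such restrictions into a methodology concrete, here is a minimal sketch of what data minimisation might look like in a crowdsourced pipeline. Everything in it (the field names, the salt, the coarsening rules) is my own illustrative assumption, not a description of how the Tibet Research Project actually handled submissions.

import hashlib
from datetime import datetime

# Illustrative only: field names and rules are hypothetical examples,
# not the actual methodology of any real project.

SALT = "replace-with-a-project-secret"  # assumption: a per-project salt kept out of the dataset


def pseudonymise(handle: str) -> str:
    """Replace an account handle with a salted hash so analysts can
    link submissions without ever seeing who contributed them."""
    return hashlib.sha256((SALT + handle).encode("utf-8")).hexdigest()[:16]


def minimise_submission(raw: dict) -> dict:
    """Keep only what analysis needs; drop anything that could expose
    an unwitting source (handles, precise timestamps, location, device info)."""
    return {
        "contributor_id": pseudonymise(raw["handle"]),
        # Coarsen the timestamp to the month to frustrate correlation.
        "period": datetime.fromisoformat(raw["posted_at"]).strftime("%Y-%m"),
        "description": raw.get("description", ""),
        # Deliberately omitted: raw["handle"], raw["posted_at"],
        # raw.get("gps"), raw.get("device")
    }


if __name__ == "__main__":
    example = {
        "handle": "some_user",
        "posted_at": "2021-11-03T14:22:00",
        "description": "Construction visible near the site.",
        "gps": (29.65, 91.10),
    }
    print(minimise_submission(example))

The point of a sketch like this is that the protective rule lives in the pipeline itself, so a careless contributor or a later counter-investigation never has access to the raw identifying details in the first place.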

I was fortunate to have had an extremely skilled pool of contributors on that project, but would I have changed that rule looking back? No.

Another underpinning of ethical research is proportionality. I firmly believe the people whose data you process for an investigation have a right to privacy, even if they have posted vulnerabilities online. That isn’t to say analysing personal open source information is inherently unethical, but we must encourage these considerations amongst those who exploit it.

If you are investigating an individual for corruption, it would be neither proportional nor ethical to geolocate the addresses of their entire extended family, including relatives they may have long been estranged from. Likewise, a case must be made for the rights of children and vulnerable individuals, who cannot be expected to realise the significance of the information they generate and publish.

Consider scraping Facebook posts, for example. If we refer to the intelligence cycle (planning and direction, collection, processing and exploitation, analysis, and dissemination), I would argue that every stage except planning and direction would constitute a breach of privacy and be unethical unless it was necessitated by operational needs.

The intelligence cycle is an excellent framework for investigations, and one for which I believe everyone should explore the ethical pitfalls of each stage.

At the collection phase, it could be argued that posts were not submitted to the platform with third-party collection in mind. Even if you disagree with this, what follows must be dealt with sensitively.

As soon as we proceed to the exploitation stage, the privacy concerns become more apparent: people almost certainly do not consent to a detailed forensic analysis of their posts or images, even if that is something anyone could theoretically do.

These examples barely scratch the surface of the issues the field will face in the coming years. As it grows more mainstream, I envision social media sites writing new policies on data protection, coupled with legislative changes at the national level. One only needs to look at the frequent amendments and new editions of Michael Bazzell's Open Source Intelligence Techniques textbook to see how rapidly the digital landscape is evolving.

To ensure that OSI remains a positive culture with real-world benefits, everybody would be wise to play their part. Call out malpractice and encourage learning, ethics, and integrity.

There are many resources which are, in my opinion, vital guides to quality-driven, ethical research.

One of my top book recommendations covering this area in detail is "Digital Witness: Using Open Source Information for Human Rights Investigation, Documentation, and Accountability". It has been a constant companion and a valuable reference resource, not just for the ethical dilemmas one may encounter, but also for the correct procedures for handling information in a valid and trustworthy manner.

Another fantastic guide is the Berkeley Protocol, written with the intent of standardising OSI practices when investigating violations of international criminal, human rights, and humanitarian law.

Written by Tom Jarvis
OSINT Consultant and giant big huge nerd