Not long ago, the ability to digitally track someone’s daily movements just by knowing their home address, employer, or place of worship was considered a dangerous power that should remain only within the purview of nation states. But a new lawsuit in a likely constitutional battle over a New Jersey privacy law shows that anyone can now access this capability, thanks to a proliferation of commercial services that hoover up the digital exhaust emitted by widely-used mobile apps and websites.
Image: Shutterstock, Arthimides.
Delaware-based Atlas Data Privacy Corp. helps its users remove their personal information from the clutches of consumer data brokers, and from people-search services online. Backed by millions of dollars in litigation financing, Atlas so far this year has sued 151 consumer data brokers on behalf of a class that includes more than 20,000 New Jersey law enforcement officers who are signed up for Atlas services.
Atlas alleges all of these data brokers have ignored repeated warnings that they are violating Daniel’s Law, a New Jersey statute allowing law enforcement, government personnel, judges and their families to have their information completely removed from commercial data brokers. Daniel’s Law was passed in 2020 after the death of 20-year-old Daniel Anderl, who was killed in a violent attack targeting a federal judge — his mother.
Last week, Atlas invoked Daniel’s Law in a lawsuit (PDF) against Babel Street, a little-known technology company incorporated in Reston, Va. Babel Street’s core product allows customers to draw a digital polygon around nearly any location on a map of the world, and view a slightly dated (by a few days) time-lapse history of the mobile devices seen coming in and out of the specified area.
Babel Street’s LocateX platform also allows customers to track individual mobile users by their Mobile Advertising ID or MAID, a unique, alphanumeric identifier built into all Google Android and Apple mobile devices.
Babel Street can offer this tracking capability by consuming location data and other identifying information that is collected by many websites and broadcast to dozens and sometimes hundreds of ad networks that may wish to bid on showing their ad to a particular user.
This image, taken from a video recording Atlas made, shows its private investigator using Babel Street to view all of the unique mobile IDs seen over time at a mosque in Dearborn, Michigan. Each red dot represents one mobile device.
In an interview, Atlas said a private investigator they hired was offered a free trial of Babel Street, which the investigator was able to use to determine the home address and daily movements of mobile devices belonging to multiple New Jersey police officers whose families have already faced significant harassment and death threats.
Atlas said the investigator encountered Babel Street while testing hundreds of data broker tools and services to see if personal information on its users was being sold. They soon discovered Babel Street also bundles people-search services with its platform, to make it easier for customers to zero in on a specific device.
The investigator contacted Babel Street about possibly buying home addresses in certain areas of New Jersey. After listening to a sales pitch for Babel Street and expressing interest, the investigator was told Babel Street only offers their service to the government or to “contractors of the government.”
“The investigator (truthfully) mentioned that he was contemplating some government contract work in the future and was told by the Babel Street salesperson that ‘that’s good enough’ and that ‘they don’t actually check,’” Atlas shared in an email with reporters.
KrebsOnSecurity was one of five media outlets invited to review screen recordings that Atlas made while its investigator used a two-week trial version of Babel Street’s LocateX service. References and links to reporting by other publications, including 404 Media, Haaretz, NOTUS, and The New York Times, will appear throughout this story.
Collectively, these stories expose how the broad availability of mobile advertising data has created a market in which virtually anyone can build a sophisticated spying apparatus capable of tracking the daily movements of hundreds of millions of people globally.
The findings outlined in Atlas’s lawsuit against Babel Street also illustrate how mobile location data is set to massively complicate several hot-button issues, from the tracking of suspected illegal immigrants or women seeking abortions, to harassing public servants who are already in the crosshairs over baseless conspiracy theories and increasingly hostile political rhetoric against government employees.
Atlas says the Babel Street trial period allowed its investigator to find information about visitors to high-risk targets such as mosques, synagogues, courtrooms and abortion clinics. In one video, an Atlas investigator showed how they isolated mobile devices seen in a New Jersey courtroom parking lot that was reserved for jurors, and then tracked one likely juror’s phone to their home address over several days.
While the Atlas investigator had access to its trial account at Babel Street, they were able to successfully track devices belonging to several plaintiffs named or referenced in the lawsuit. They did so by drawing a digital polygon around the home address or workplace of each person in Babel Street’s platform, which focused exclusively on the devices that passed through those addresses each day.
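The mechanics behind that kind of query are not exotic. As a rough illustration only, and not Babel Street's actual code, the Python sketch below reduces a geofence to a point-in-polygon test over hypothetical ad-derived location pings; the coordinates and device IDs are invented.

# Illustrative sketch only: a generic geofence filter over hypothetical ad-derived pings.
# Each ping is (mobile advertising ID, latitude, longitude, UNIX timestamp).

def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside polygon, a list of (lat, lon) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        if (lon1 > lon) != (lon2 > lon):
            crossing_lat = (lat2 - lat1) * (lon - lon1) / (lon2 - lon1) + lat1
            if lat < crossing_lat:
                inside = not inside
    return inside

def devices_seen_in_area(pings, polygon):
    """Return the set of advertising IDs whose pings fall inside the drawn polygon."""
    return {maid for (maid, lat, lon, ts) in pings if point_in_polygon(lat, lon, polygon)}

# Hypothetical data: a small polygon drawn around a parking lot.
fence = [(40.6091, -74.2740), (40.6091, -74.2720), (40.6075, -74.2720), (40.6075, -74.2740)]
pings = [
    ("maid-aaaa", 40.6083, -74.2731, 1717000000),
    ("maid-bbbb", 40.7000, -74.0000, 1717000300),
]
print(devices_seen_in_area(pings, fence))  # {'maid-aaaa'}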
Each red dot in this Babel Street map represents a unique mobile device that has been seen since April 2022 at a Jewish synagogue in Los Angeles, Calif. Image: Atlas Data Privacy Corp.
One unique feature of Babel Street is the ability to toggle a “night” mode, which makes it relatively easy to determine within a few meters where a target typically lays their head each night (because their phone is usually not far away).
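The inference behind such a feature is simple to sketch. Assuming only a pile of timestamped pings for one device (hypothetical data, not the vendor's method), the most frequently observed overnight location is usually within a short walk of the owner's bed:

# Illustrative only: inferring a likely sleeping location from overnight pings.
from collections import Counter
from datetime import datetime, timezone

def likely_home(pings, night_hours=range(0, 6)):
    """pings: list of (lat, lon, unix_ts). Return the most common rounded
    coordinate pair seen during overnight hours, or None if there are none."""
    overnight = []
    for lat, lon, ts in pings:
        hour = datetime.fromtimestamp(ts, tz=timezone.utc).hour
        if hour in night_hours:
            # Rounding to 3 decimal places (~100 m) collapses nearby pings into one bucket.
            overnight.append((round(lat, 3), round(lon, 3)))
    if not overnight:
        return None
    return Counter(overnight).most_common(1)[0][0]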
Atlas plaintiffs Scott and Justyna Maloney are both veteran officers with the Rahway, NJ police department who live together with their two young children. In April 2023, Scott and Justyna became the target of intense harassment and death threats after Officer Justyna responded to a routine call about a man filming people outside of the Motor Vehicle Commission in Rahway.
The man filming the Motor Vehicle Commission that day is a social media personality who often solicits police contact and then records himself arguing about constitutional rights with the responding officers.
Officer Justyna’s interaction with the man was entirely peaceful, and the episode appeared to end without incident. But after a selectively edited video of that encounter went viral, their home address and unpublished phone numbers were posted online. When their tormentors figured out that Scott was also a cop (a sergeant), the couple began receiving dozens of threatening text messages, including specific death threats.
According to the Atlas lawsuit, one of the messages to Mr. Maloney demanded money, and warned that his family would “pay in blood” if he didn’t comply. Sgt. Maloney said he then received a video in which a masked individual pointed a rifle at the camera and told him that his family was “going to get [their] heads cut off.”
Maloney said a few weeks later, one of their neighbors saw two suspicious individuals in ski masks parked one block away from the home and alerted police. Atlas’s complaint says video surveillance from neighboring homes shows the masked individuals circling the Maloneys’ home. The responding officers arrested two men, who were armed, for unlawful possession of a firearm.
According to Google Maps, Babel Street shares a corporate address with Google and the consumer credit reporting bureau TransUnion.
Atlas said their investigator was not able to conclusively find Scott Maloney’s iPhone in the Babel Street platform, but they did find Justyna’s. Babel Street had nearly 100,000 hits for her phone over several months, allowing Atlas to piece together an intimate picture of Justyna’s daily movements and meetings with others.
An Atlas investigator visited the Maloneys and inspected Justyna’s iPhone, and determined the only app that used her device’s location data was from the department store Macy’s.
In a written response to questions, Macy’s said its app includes an opt-in feature for geo-location, “which allows customers to receive an enhanced shopping experience based on their location.”
“We do not store any customer location information,” Macy’s wrote. “We share geo-location data with a limited number of partners who help us deliver this enhanced app experience. Furthermore, we have no connection with Babel Street” [link added for context].
Justyna’s experience highlights a stark reality about the broad availability of mobile location data: Even if the person you’re looking for isn’t directly identifiable in platforms like Babel Street, it is likely that at least some of that person’s family members are. In other words, it’s often trivial to infer the location of one device by successfully locating another.
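A minimal sketch of that household inference, again using invented data rather than any vendor's implementation: count how many distinct nights each device is seen within roughly 100 meters of a known home, and anything that shows up repeatedly is probably living there.

# Illustrative only: inferring household devices from repeated overnight co-location.
from collections import defaultdict
from datetime import datetime, timezone

def likely_household(pings_by_maid, home_lat, home_lon, min_nights=3):
    """pings_by_maid: {maid: [(lat, lon, unix_ts), ...]}. Return MAIDs seen within
    roughly 100 m of the home on at least min_nights distinct overnight dates."""
    nights_at_home = defaultdict(set)
    for maid, pings in pings_by_maid.items():
        for lat, lon, ts in pings:
            dt = datetime.fromtimestamp(ts, tz=timezone.utc)
            close = abs(lat - home_lat) < 0.001 and abs(lon - home_lon) < 0.001
            if close and dt.hour < 6:
                nights_at_home[maid].add(dt.date())
    return {maid for maid, nights in nights_at_home.items() if len(nights) >= min_nights}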
The terms of service for Babel Street’s Locate X service state that the product “may not be used as the basis for any legal process in any country, including as the basis for a warrant, subpoena, or any other legal or administrative action.” But Scott Maloney said he’s convinced by their experience that not even law enforcement agencies should have access to this capability without a warrant.
“As a law enforcement officer, in order for me to track someone I need a judge to sign a warrant – and that’s for a criminal investigation after we’ve developed probable cause,” Mr. Maloney said in an interview. “Data brokers tracking me and my family just to sell that information for profit, without our consent, and even after we’ve explicitly asked them not to is deeply disturbing.”
Mr. Maloney’s law enforcement colleagues in other states may see things differently. In August, The Texas Observer reported that state police plan to spend more than $5 million on a contract for a controversial surveillance tool called Tangles from the tech firm PenLink. Tangles is an AI-based web platform that scrapes information from the open, deep and dark web, and it has a premier feature called WebLoc that can be used to geofence mobile devices.
The Associated Press reported last month that law enforcement agencies from suburban Southern California to rural North Carolina have been using an obscure cell phone tracking tool called Fog Reveal — at times without warrants — that gives them the ability to follow people’s movements going back many months.
It remains unclear precisely how Babel Street is obtaining the abundance of mobile location data made available to users of its platform. The company did not respond to multiple requests for comment.
But according to a document (PDF) obtained under a Freedom of Information Act request with the Department of Homeland Security’s Science and Technology directorate, Babel Street re-hosts data from the commercial phone tracking firm Venntel.
On Monday, the Substack newsletter All-Source Intelligence unearthed documents indicating that the U.S. Federal Trade Commission has opened an inquiry into Venntel and its parent company Gravy Analytics.
“Venntel has also been a data partner of the police surveillance contractor Fog Data Science, whose product has been described as ‘mass surveillance on a budget,'” All-Source’s Jack Poulson wrote. “Venntel was also reported to have been a primary data source of the controversial ‘Locate X’ phone tracking product of the American data fusion company Babel Street.”
The Mobile Advertising ID or MAID — the unique alphanumeric identifier assigned to each mobile device — was originally envisioned as a way to distinguish individual mobile customers without relying on personally identifiable information such as phone numbers or email addresses.
However, there is now a robust industry of marketing and advertising companies that specialize in assembling enormous lists of MAIDs that are “enriched” with historical and personal information about the individual behind each MAID.
One of many vendors that “enrich” MAID data with other identifying information, including name, address, email address and phone number.
Atlas said its investigator wanted to know whether they could find enriched MAID records on their New Jersey law enforcement customers, and soon found plenty of ad data brokers willing to sell it.
Some vendors offered only a handful of data fields, such as first and last name, MAID and email address. Other brokers sold far more detailed histories along with their MAID, including each subject’s social media profiles, precise GPS coordinates, and even likely consumer category.
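Stripped of the marketing language, “enrichment” amounts to a join keyed on the MAID. The records below are entirely hypothetical, but they show how a location ping and an identity file snap together once both are keyed by the same identifier:

# Illustrative only: joining hypothetical location pings to hypothetical identity records.
location_pings = [
    {"maid": "38400000-8cf0-11bd-b23e-10b96e40000d", "lat": 40.609, "lon": -74.273},
]
identity_records = {
    "38400000-8cf0-11bd-b23e-10b96e40000d": {
        "name": "Jane Doe",
        "email": "jane@example.com",
        "home_address": "123 Main St, Anytown, NJ",
    },
}

enriched = [{**ping, **identity_records.get(ping["maid"], {})} for ping in location_pings]
# Each enriched record now ties a named person to precise coordinates.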
How are advertisers and data brokers gaining access to so much information? Some of it comes from apps on your phone, such as AccuWeather, GasBuddy, Grindr, and MyFitnessPal, which collect your MAID and location and sell them to brokers.
A user’s MAID profile and location data also is commonly shared as a consequence of simply using a smartphone to visit a web page that features ads. In the few milliseconds before those ads load, the website will send a “bid request” to various ad exchanges, where advertisers can bid on the chance to place their ad in front of users who match the consumer profiles they’re seeking. A great deal of data can be included in a bid request, including the user’s precise location (the current open standard for bid requests is detailed here).
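To make that concrete, here is a simplified, hypothetical example of the kind of bid request an exchange might fan out to bidders, loosely following the OpenRTB standard referenced above; real requests carry many more fields, and the values shown are invented.

import json

# Simplified, hypothetical OpenRTB-style bid request. Note that the device block
# can carry both the advertising ID ("ifa") and precise coordinates, and it is
# broadcast to every participating bidder, not just the one that wins the auction.
bid_request = {
    "id": "example-auction-id",
    "imp": [{"id": "1", "banner": {"w": 320, "h": 50}}],
    "device": {
        "ifa": "38400000-8cf0-11bd-b23e-10b96e40000d",  # the MAID
        "os": "Android",
        "ip": "203.0.113.7",
        "geo": {"lat": 40.6089, "lon": -74.2732, "type": 1},  # type 1 = GPS-derived
    },
    "app": {"bundle": "com.example.weather"},
}
print(json.dumps(bid_request, indent=2))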
The trouble is that virtually anyone can access the “bidstream” data flowing through these so-called “realtime bidding” networks, because the information is simultaneously broadcast in the clear to hundreds of entities around the world.
The result is that there are a number of marketing companies that now enrich and broker access to this mobile location information. Earlier this year, the German news outlet netzpolitik.org purchased a bidstream data set containing more than 3.6 billion data points, and shared the information with the German daily BR24. They concluded that the data they obtained (through a free trial, no less) made it possible to establish movement profiles — some of them quite precise — of several million people across Germany.
A screenshot from the BR24/Netzpolitik story about their ability to track millions of Germans, including many employees of the German Federal Police and Interior Ministry.
Politico recently covered startling research from universities in New Hampshire, Kentucky and St. Louis that showed how the mobile advertising data they acquired allowed them to link visits from investigators with the U.S. Securities and Exchange Commission (SEC) to insiders selling stock before the investigations became public knowledge.
The researchers in that study said they didn’t attempt to use the same methods to track regulators from other agencies, but that virtually anyone could do it.
Justin Sherman, a distinguished fellow at Georgetown Law’s Center for Privacy and Technology, called the research a “shocking demonstration of what happens when companies can freely harvest Americans’ geolocation data and sell it for their chosen price.”
“Politicians should understand how they, their staff, and public servants are threatened by the sale of personal data—and constituent groups should realize that talk of data broker ‘controls’ or ‘best practices’ is designed by companies to distract from the underlying problems and the comprehensive privacy and security solutions,” Sherman wrote for Lawfare this week.
The Orwellian nature of modern mobile advertising networks may soon have far-reaching implications for women’s reproductive rights, as more states move to outlaw abortion within their borders. The 2022 Dobbs decision by the U.S. Supreme Court discarded the federal right to abortion, and 14 states have since enacted strict abortion bans.
Anti-abortion groups are already using mobile advertising data to advance their cause. In May 2023, The Wall Street Journal reported that an anti-abortion group in Wisconsin used precise geolocation data to direct ads to women it suspected of seeking abortions.
As it stands, there is little to stop anti-abortion groups from purchasing bidstream data (or renting access to a platform like Babel Street) and using it to geofence abortion clinics, potentially revealing all mobile devices transiting through these locations.
Atlas said its investigator geofenced an abortion clinic and was able to identify a likely employee at that clinic, following their daily route to and from that individual’s home address.
A still shot from a video Atlas shared of its use of Babel Street to identify and track an employee traveling each day between their home and the clinic.
Last year, Idaho became the first state to outlaw “abortion trafficking,” which the Idaho Capital Sun reports is defined as “recruiting, harboring or transporting a pregnant minor to get an abortion or abortion medication without parental permission.” Tennessee now has a similar law, and GOP lawmakers in five other states introduced abortion trafficking bills that failed to advance this year, the Sun reports.
Atlas said its investigator used Babel Street to identify and track a person traveling from their home in Alabama — where abortion is now illegal — to an abortion clinic just over the border in Tallahassee, Fla. — and back home again within a few hours. Abortion rights advocates and providers are currently suing Alabama Attorney General Steve Marshall, seeking to block him from prosecuting people who help patients travel out-of-state to end pregnancies.
Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation (EFF), a non-profit digital rights group, said she’s extremely concerned about dragnet surveillance of people crossing state lines in order to get abortions.
“Specifically, Republican officials from states that have outlawed abortion have made it clear that they are interested in targeting people who have gone to neighboring states in order to get abortions, and to make it more difficult for people who are seeking abortions to go to neighboring states,” Galperin said. “It’s not a great leap to imagine that states will do this.”
Atlas found that for the right price (typically $10-50k a year), brokers can provide access to tens of billions of data points covering large swaths of the US population and the rest of the world.
Based on the data sets Atlas acquired — many of which included older MAID records — they estimate they could locate roughly 80 percent of Android-based devices, and about 25 percent of Apple phones. Google refers to its MAID as the “Android Advertising ID,” (AAID) while Apple calls it the “Identifier for Advertisers” (IDFA).
What accounts for the disparity between the number of Android and Apple devices that can be found in mobile advertising data? In April 2021, Apple shipped version 14.5 of its iOS operating system, which introduced a technology called App Tracking Transparency (ATT) that requires apps to get affirmative consent before they can track users by their IDFA or any other identifier.
Apple’s introduction of ATT had a swift and profound impact on the advertising market: Less than a year later Facebook disclosed that the iPhone privacy feature would decrease the company’s 2022 revenues by about $10 billion.
Source: cnbc.com.
Google runs by far the world’s largest ad exchange, known as AdX. The U.S. Department of Justice, which has accused Google of building a monopoly over the technology that places ads on websites, estimates that Google’s ad exchange controls 47 percent of the U.S. market and 56 percent globally.
Google’s Android is also the dominant mobile operating system worldwide, with more than 72 percent of the market. In the U.S., however, iPhone users claim approximately 55 percent of the market, according to TechRepublic.
In response to requests for comment, Google said it does not send real time bidding requests to Babel Street, nor does it share precise location data in bid requests. The company added that its policies explicitly prohibit the sale of data from real-time bidding, or its use for any purpose other than advertising.
Google said its MAIDs are randomly generated and do not contain IP addresses, GPS coordinates, or any other location data, and that its ad systems do not share anyone’s precise location data.
“Android has clear controls for users to manage app access to device location, and reset or delete their advertising ID,” Google’s written statement reads. “If we learn that someone, whether an app developer, ad tech company or anyone else, is violating our policies, we take appropriate action. Beyond that, we support legislation and industry collaboration to address these types of data practices that negatively affect the entire mobile ecosystem, including all operating systems.”
In a written statement shared with reporters, Apple said Location Services is not on by default in its devices. Rather, users must enable Location Services and must give permission to each app or website to use location data. Users can turn Location Services off at any time, and can change whether apps have access to location at any time. The user’s choices include precise vs. approximate location, as well as a one-time grant of location access by the app.
“We believe that privacy is a fundamental human right, and build privacy protections into each of our products and services to put the user in control of their data,” an Apple spokesperson said. “We minimize personal data collection, and where possible, process data only on users’ devices.”
Zach Edwards is a senior threat analyst at the cybersecurity firm SilentPush who has studied the location data industry closely. Edwards said Google and Apple can’t keep pretending like the MAIDs being broadcast into the bidstream from hundreds of millions of American devices aren’t making most people trivially trackable.
“The privacy risks here will remain until Apple and Google permanently turn off their mobile advertising ID schemes and admit to the American public that this is the technology that has been supporting the global data broker ecosystem,” he said.
According to Bloomberg Law, between 2019 and 2023, threats against federal judges have more than doubled. Amid increasingly hostile political rhetoric and conspiracy theories against government officials, a growing number of states are seeking to pass their own versions of Daniel’s Law.
Last month, a retired West Virginia police officer filed a class action lawsuit against the people-search service Whitepages for listing their personal information in violation of a statute the state passed in 2021 that largely mirrors Daniel’s Law.
In May 2024, Maryland passed the Judge Andrew F. Wilkinson Judicial Security Act — named after a county circuit court judge who was murdered by an individual involved in a divorce proceeding over which he was presiding. The law allows current and former members of the Maryland judiciary to request their personal information not be made available to the public.
Under the Maryland law, personal information can include a home address; telephone number; email address; Social Security number or federal tax ID number; bank account or payment card number; a license plate or other unique vehicle identifier; a birth or marital record; a child’s name, school, or daycare; place of worship; and place of employment for a spouse, child, or dependent.
The law firm Troutman Pepper writes that “so far in 2024, 37 states have begun considering or have adopted similar privacy-based legislation designed to protect members of the judiciary and, in some states, other government officials involved in law enforcement.”
Atlas alleges that in response to requests to have data on its New Jersey law enforcement clients scrubbed from consumer records sold by LexisNexis, the data broker retaliated by freezing the credit of approximately 18,500 people, and falsely reporting them as identity theft victims.
In addition, Atlas said LexisNexis started returning failure codes indicating they had no record of these individuals, resulting in denials when officers attempted to refinance loans or open new bank accounts.
The data broker industry has responded by having at least 70 of the Atlas lawsuits moved to federal court, and challenging the constitutionality of the New Jersey statute as overly broad and a violation of the First Amendment.
Attorneys for the data broker industry argued in their motion to dismiss that there is “no First Amendment doctrine that exempts a content-based restriction from strict scrutiny just because it has some nexus with a privacy interest.”
Atlas’s lawyers responded that data covered under Daniel’s Law — personal information of New Jersey law enforcement officers — is not free speech. Atlas notes that while defending against comparable lawsuits, the data broker industry has argued that home address and phone number data are not “communications.”
“Data brokers should not be allowed to argue that information like addresses are not ‘communications’ in one context, only to turn around and claim that addresses are protectable communications,” Atlas argued (PDF). “Nor can their change of course alter the reality that the data at issue is not speech.”
The judge overseeing the challenge is expected to rule on the motion to dismiss within the next few weeks. Regardless of the outcome, the decision is likely to be appealed all the way to the U.S. Supreme Court.
Meanwhile, media law experts say they’re concerned that enacting Daniel’s Law in other states could limit the ability of journalists to hold public officials accountable, and allow authorities to pursue criminal charges against media outlets that publish the same type of public and government records that fuel the people-search industry.
Sen. Ron Wyden (D-Ore.) said Congress’ failure to regulate data brokers, and the administration’s continued opposition to bipartisan legislation that would limit data sales to law enforcement, have created this current privacy crisis.
“Whether location data is being used to identify and expose closeted gay Americans, or to track people as they cross state lines to seek reproductive health care, data brokers are selling Americans’ deepest secrets and exposing them to serious harm, all for a few bucks,” Wyden said in a statement shared with KrebsOnSecurity, 404 Media, Haaretz, NOTUS, and The New York Times.
Sen. Wyden said Google also deserves blame for refusing to follow Apple’s lead by removing companies’ ability to track phones.
“Google’s insistence on uniquely tracking Android users – and allowing ad companies to do so as well – has created the technical foundations for the surveillance economy and the abuses stemming from it,” Wyden said.
Georgetown Law’s Justin Sherman said the data broker and mobile ad industries claim there are protections in place to anonymize mobile location data and restrict access to it, and that there are limits to the kinds of invasive inferences one can make from location data. The data broker industry also likes to tout the usefulness of mobile location data in fighting retail fraud, he said.
“All kinds of things can be inferred from this data, including people being targeted by abusers, or people with a particular health condition or religious belief,” Sherman said. “You can track jurors, law enforcement officers visiting the homes of suspects, or military intelligence people meeting with their contacts. The notion that the sale of all this data is preventing harm and fraud is hilarious in light of all the harm it causes enabling people to better target their cyber operations, or learning about people’s extramarital affairs and extorting public officials.”
Privacy experts say disabling or deleting your device’s MAID will have no effect on how your phone operates, except that you may begin to see far less targeted ads on that device.
Any Android apps with permission to use your location should appear when you navigate to the Settings app, Location, and then App Permissions. “Allowed all the time” is the most permissive setting, followed by “Allowed only while in use,” “Ask every time,” and “Not allowed.”
Android users can delete their ad ID permanently, by opening the Settings app and navigating to Privacy > Ads. Tap “Delete advertising ID,” then tap it again on the next page to confirm. According to the EFF, this will prevent any app on your phone from accessing the ad ID in the future. Google’s documentation on this is here.
Image: eff.org
By default, Apple’s iOS requires apps to ask permission before they can access your device’s IDFA. When you install a new app, it may ask for permission to track you. When prompted to do so by an app, select the “Ask App Not to Track” option. Apple users also can set the “Allow apps to request to track” switch to the “off” position, which will block apps from asking to track you.
Apple’s Privacy and Ad Tracking Settings.
Apple also has its own targeted advertising system which is separate from third-party tracking enabled by the IDFA. To disable it, go to Settings, Privacy, and Apple Advertising, and ensure that the “Personalized Ads” setting is set to “off.”
Finally, if you’re the type of reader who’s the default IT support person for a small group of family or friends (bless your heart), it would be a good idea to set their devices not to track them, and to disable any apps that may have location data sharing turned on 24/7.
There is a dual benefit to this altruism: it is clearly in the device owners’ best interests, and it protects you as well. Even if your own device isn’t directly trackable via advertising data, making sure their devices are opted out of said tracking also reduces the likelihood that you are trackable simply by being physically close to those who are.
Free-to-use IOC feed for various tools/malware. It started out for just C2 tools but has morphed into tracking infostealers and botnets as well. It uses Shodan searches to collect the IPs. The most recent collection is always stored in data; the IPs are broken down by tool and there is an all.txt.
The feed should update daily. I am actively working on making the backend more reliable.
Many of the Shodan queries have been sourced from other CTI researchers.
Huge shoutout to them!
Thanks to BertJanCyber for creating the KQL query for ingesting this feed
And finally, thanks to Y_nexro for creating C2Live in order to visualize the data
If you want to host a private version, put your Shodan API key in an environment variable called SHODAN_API_KEY
echo SHODAN_API_KEY=API_KEY >> ~/.bashrc
python3 -m pip install -r requirements.txt
python3 tracker.py
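At its core, collection like this boils down to running saved Shodan queries and recording the matching IPs. The snippet below is a minimal sketch of that idea using the official shodan Python library; the query shown is a generic example and not necessarily one of the feed's actual searches.

# Minimal sketch of the idea, not the feed's exact code.
import os
import shodan

api = shodan.Shodan(os.environ["SHODAN_API_KEY"])

# Example query for a well-known C2 framework; the real project ships its own curated queries.
results = api.search('product:"Cobalt Strike Beacon"')

ips = sorted({match["ip_str"] for match in results["matches"]})
for ip in ips:
    print(ip)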
I encourage opening an issue/PR if you know of any additional Shodan searches for identifying adversary infrastructure. I will not set any hard guidelines around what can be submitted, just know, fidelity is paramount (high true/false positive ratio is the focus).
CureIAM is an easy-to-use, reliable, and performant engine for Least Privilege Principle enforcement on GCP cloud infrastructure. It enables DevOps and security teams to quickly clean up accounts in GCP infra that have been granted more permissions than they require. CureIAM fetches recommendations and insights from the GCP IAM recommender, scores them, and enforces those recommendations automatically on a daily basis. It takes care of scheduling and all other aspects of running these enforcement jobs at scale. It is built on top of the GCP IAM recommender APIs and the Cloudmarker framework.
Discover what makes CureIAM scalable and production grade.
Each recommendation is scored with safe_to_apply_score, risk_score, and over_privilege_score. Each score serves a different purpose. safe_to_apply_score, for example, identifies whether a recommendation can be applied on an automated basis, based on the threshold set in the CureIAM.yaml config file.
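As a rough, hypothetical illustration of how such a threshold gates automatic enforcement (the real logic lives in CureIAM's processor plugin and is more involved), a filter of this shape is all that is needed:

# Hypothetical illustration of threshold-gated enforcement, not CureIAM's actual code.
def recommendations_to_enforce(recommendations, min_safe_to_apply_score=50):
    """Keep only recommendations scored safe enough to apply automatically."""
    return [
        rec for rec in recommendations
        if rec.get("safe_to_apply_score", 0) >= min_safe_to_apply_score
    ]

recs = [
    {"account": "serviceAccount:app@project.iam.gserviceaccount.com", "safe_to_apply_score": 80},
    {"account": "user:foo@bar.com", "safe_to_apply_score": 20},
]
print(recommendations_to_enforce(recs))  # only the first record would be auto-enforced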
Since CureIAM is built with Python, you can run it locally with these commands. Before running, make sure a configuration file is ready in one of /etc/CureIAM.yaml, ~/.CureIAM.yaml, ~/CureIAM.yaml, or CureIAM.yaml, and that a service account JSON file is present in the current directory, preferably named cureiamSA.json. This SA private key can be named anything, but for the Docker image build it is preferred to use this name. Make sure to reference this file in the config for GCP cloud.
# Install necessary dependencies
$ pip install -r requirements.txt
# Run CureIAM now
$ python -m CureIAM -n
# Run CureIAM process as schedular
$ python -m CureIAM
# Check CureIAM help
$ python -m CureIAM --help
CureIAM can also be run inside a Docker environment; this is completely optional and can be used for CI/CD with a K8s cluster deployment.
# Build docker image from dockerfile
$ docker build -t cureiam .
# Run the image, as schedular
$ docker run -d cureiam
# Run the image now
$ docker run -f cureiam -m cureiam -n
The CureIAM.yaml configuration file is the heart of the CureIAM engine. Everything the engine does is based on the pipeline configured in this config file. Let's break this down into different sections to make the config look simpler.
logger:
  version: 1
  disable_existing_loggers: false
  formatters:
    verysimple:
      format: >-
        [%(process)s]
        %(name)s:%(lineno)d - %(message)s
      datefmt: "%Y-%m-%d %H:%M:%S"
  handlers:
    rich_console:
      class: rich.logging.RichHandler
      formatter: verysimple
    file:
      class: logging.handlers.TimedRotatingFileHandler
      formatter: simple
      filename: /tmp/CureIAM.log
      when: midnight
      encoding: utf8
      backupCount: 5
  loggers:
    adal-python:
      level: INFO
  root:
    level: INFO
    handlers:
      - rich_console
      - file

schedule: "16:00"
This subsection of the config uses the Rich logging module and schedules CureIAM to run daily at 16:00.
Plugins are declared in the plugins section of CureIAM.yaml. You can think of this section as the declaration of the different plugins.

plugins:
  gcpCloud:
    plugin: CureIAM.plugins.gcp.gcpcloud.GCPCloudIAMRecommendations
    params:
      key_file_path: cureiamSA.json
  filestore:
    plugin: CureIAM.plugins.files.filestore.FileStore
  gcpIamProcessor:
    plugin: CureIAM.plugins.gcp.gcpcloudiam.GCPIAMRecommendationProcessor
    params:
      mode_scan: true
      mode_enforce: true
      enforcer:
        key_file_path: cureiamSA.json
        allowlist_projects:
          - alpha
        blocklist_projects:
          - beta
        blocklist_accounts:
          - foo@bar.com
        allowlist_account_types:
          - user
          - group
          - serviceAccount
        blocklist_account_types:
          - None
        min_safe_to_apply_score_user: 0
        min_safe_to_apply_score_group: 0
        min_safe_to_apply_score_SA: 50
  esstore:
    plugin: CureIAM.plugins.elastic.esstore.EsStore
    params:
      # Change http to https later if your elastic are using https
      scheme: http
      host: es-host.com
      port: 9200
      index: cureiam-stg
      username: security
      password: securepassword
Each of these plugin declarations has to be of this form:

plugins:
  <plugin-name>:
    plugin: <class-name-as-python-path>
    params:
      param1: val1
      param2: val2
For example, take the plugin CureIAM.stores.esstore.EsStore, which refers to the EsStore class defined in that file. All the params defined in the YAML have to match the declaration in the __init__() function of the same plugin class.
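To make that mapping concrete, here is a purely hypothetical plugin skeleton; the class name and parameters are invented, and the only point is that the keyword arguments of __init__() mirror the params keys from the YAML:

# Hypothetical plugin skeleton; the __init__() parameters must mirror the YAML "params" keys.
import json

class ExampleStore:
    def __init__(self, path="/tmp/cureiam-output.jsonl"):
        # A "path: ..." entry under this plugin's params in CureIAM.yaml would map here.
        self.path = path

    def write(self, record):
        """Persist a single audit record as one JSON line (simplified)."""
        with open(self.path, "a") as f:
            f.write(json.dumps(record) + "\n")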
audits:
  IAMAudit:
    clouds:
      - gcpCloud
    processors:
      - gcpIamProcessor
    stores:
      - filestore
      - esstore
Multiple audits can be created out of this. The one created here is named IAMAudit, with four plugins in use: gcpCloud, gcpIamProcessor, filestore, and esstore. Note these are the same plugin names defined in Step 2. Again, this is like defining the pipeline, not actually running it. It will be considered for running with the definition in the next step.
Schedule CureIAM to run the audits defined in the previous step:

run:
  - IAMAudits

And this makes up the entire configuration for CureIAM. You can find the full sample here; this config-driven pipeline concept is inherited from the Cloudmarker framework.
The JSON that is indexed in Elasticsearch using the Elasticsearch store plugin can be used to generate a dashboard in Kibana.
[Please do!] We are looking for any kind of contribution to improve CureIAM's core functionality and documentation. When in doubt, make a PR!
Gojek Product Security Team
Adding the version in the library to avoid any backward-compatibility issues.
Running docker compose: docker-compose -f docker_compose_es.yaml up
To run in scan-only mode, set mode_scan: true and mode_enforce: false in the gcpIamProcessor config.
HardHat is a multiplayer C# .NET-based command and control framework. Designed to aid in red team engagements and penetration testing. HardHat aims to improve the quality of life factors during engagements by providing an easy-to-use but still robust C2 framework.
It contains three primary components, an ASP.NET teamserver, a blazor .NET client, and C# based implants.
Alpha Release - 3/29/23 NOTE: HardHat is in Alpha release; it will have bugs, missing features, and unexpected things will happen. Thank you for trying it, and please report back any issues or missing features so they can be addressed.
Discord: Join the community to talk about HardHat C2, programming, red teaming, and general cyber security things. The Discord community is also a great way to request help, submit new features, stay up to date on the latest additions, and submit bugs.
Documentation can be found at docs.
To configure the team server's starting address (where clients will connect), edit HardHatC2\TeamServer\Properties\LaunchSettings.json, changing the "applicationUrl": "https://127.0.0.1:5000" to the desired location and port. Start the teamserver with dotnet run from its top-level folder ../HrdHatC2/Teamserver/.
Code contributions are welcome; feel free to submit feature requests, pull requests, or send me your ideas on Discord.
Do you remember the days of printing out directions from your desktop? Or the times when passengers were navigation co-pilots armed with a 10-pound book of maps? You can thank location services on your smartphone for today’s hassle-free and paperless way of getting around town and exploring exciting new places.
However, location services can prove a hassle to your online privacy when you enable location sharing. Location sharing is a feature on many connected devices – smartphones, tablets, digital cameras, smart fitness watches – that pinpoints your exact location and then distributes your coordinates to online advertisers, your social media following, or strangers.
While there are certain scenarios where sharing your location is a safety measure, in most cases, it’s an online safety hazard. Here’s what you should know about location sharing and the effects it has on your privacy.
Location sharing is most beneficial when you’re unsure about new surroundings and want to let your loved ones know that you’re ok. For example, if you’re traveling by yourself, it may be a good idea to share the location of your smartphone with an emergency contact. That way, if circumstances cause you to deviate from your itinerary, your designated loved one can reach out and ensure your personal safety.
The key to sharing your location safely is to only allow your most trusted loved one to track the whereabouts of you and your connected device. Once you’re back on known territory, you may want to consider turning off all location services, since it presents a few security and privacy risks.
In just about every other case, you should definitely think twice about enabling location sharing on your smartphone. Here are three risks it poses to your online privacy and possibly your real-life personal safety:
Does it sometimes seem like your phone, tablet, or laptop is listening to your conversations? Are the ads you get in your social media feeds or during ad breaks in your gaming apps a little too accurate? When ad tracking is enabled on your phone, it allows online advertisers to collect your personal data that you add to your various online accounts to better predict what ads you might like. Personal details may include your full name, birthday, address, income, and, thanks to location tracking, your hometown and regular neighborhood haunts.
If advertisers kept these details to themselves, it may just seem like a creepy invasion of privacy; however, data brokerage sites may sell your personally identifiable information (PII) to anyone, including cybercriminals. The average person has their PII for sale on more than 30 sites and 98% of people never gave their permission to have their information sold online. Yet, data brokerage sites are legal.
One way to keep your data out of the hands of advertisers and cybercriminals is to limit the amount of data you share online and to regularly erase your data from brokerage sites. First, turn off location services and disable ad tracking on all your apps. Then, consider signing up for McAfee Personal Data Cleanup, which scans, removes, and monitors data brokerage sites for your personal details, thus better preserving your online privacy.
Location sharing may present a threat to your personal safety. Stalkers could be someone you know or a stranger. Fitness watches that connect to apps that share your outdoor exercising routes could be especially risky, since over time you’re likely to reveal patterns of the times and locations where one could expect to run into you.
Additionally, stalkers may find you through your geotagged social media posts. Geotagging is a social media feature that adds the location to your posts. Live updates, like live tweeting or real-time Instagram stories, can pinpoint your location accurately and thus alert someone on where to find you.
Social engineering is an online scheme where cybercriminals learn all there is about you from your social media accounts and then use that information to impersonate you or to tailor a scam to your interests. Geotagged photos and posts can tell a scammer a lot about you: your hometown, your school or workplace, your favorite café, etc.
With these details, a social engineer could fabricate a fundraiser for your town, for example. Social engineers are notorious for evoking strong emotions in their pleas for funds, so beware of any direct messages you receive that make you feel very angry or very sad. With the help of ChatGPT, social engineering schemes are likely going to sound more believable than ever before. Slow down and conduct your own research before divulging any personal or payment details to anyone you’ve never met in person.
Overall, it’s best to live online as anonymously as possible, which includes turning off your location services when you feel safe in your surroundings. McAfee+ offers several features to improve your online privacy, such as a VPN, Personal Data Cleanup, and Online Account Cleanup.
We all know that our phones know a lot about us. And they most certainly know a lot about where we go, thanks to the several ways they can track our location.
Location tracking on your phone offers plenty of benefits, such as with apps that can recommend a good restaurant nearby, serve up the weather report for your exact location, or connect you with singles for dating in your area. Yet the apps that use location tracking may do more with your location data than that. They may collect it, and in turn sell it to advertisers and potentially other third parties that have an interest in where you go and what you do.
Likewise, cell phone providers have other means of collecting location information from your phone, which they may use for advertising and other purposes as well.
If that sounds like more than you’re willing to share, know that you can do several things that can limit location tracking on your phone—and thus limit the information that can potentially end up in other people’s hands.
As we look at the ways you can limit location tracking on your phone, it helps to know the basics of how smartphones can track your movements.
For starters, outside of shutting down your phone completely, your phone can be used to determine your location to varying degrees of accuracy depending on the method used, such as GPS, cell tower connections, nearby Wi-Fi networks, and Bluetooth.
Now here’s what makes these tracking methods so powerful: in addition to the way they can determine your phone’s location, they’re also quite good at determining your identity too. With it, companies know who you are, where you are, and potentially some idea of what you’re doing there based on your phone’s activity.
Throughout our blogs we refer to someone’s identity as a jigsaw puzzle. Some pieces are larger than others, like your Social Security number or tax ID number being among the biggest because they are so unique. Yet if someone gathers enough of those smaller pieces, they can put those pieces together and identify you.
Things like your phone’s MAC address, ad IDs, IP address, device profile, and other identifiers are examples of those smaller pieces, all of which can get collected. In the hands of the collector, they can potentially create a picture of who you are and where you’ve been.
What happens to your data largely depends on what you’ve agreed to.
In terms of apps, we’ve all seen the lengthy user agreements that we click on during the app installation process. Buried within them are terms put forth by the app developer that cover what data the app collects, how it’s used, and if it may be shared with or sold to third parties. Also, during the installation process, the app may ask for permissions to access certain things on your phone, like photos, your camera, and yes, location services so it can track you. When you click “I Agree,” you indeed agree to all those terms and permissions.
Needless to say, some apps only use and collect the bare minimum of information as part of the agreement. On the other end of the spectrum, some apps will take all they can get and then sell the information they collect to third parties, such as data brokers that build exacting profiles of individuals, their histories, their interests, and their habits.
In turn, those data brokers will sell that information to anyone, which can be used by advertisers along with identity thieves, scammers, and spammers. And as reported in recent years, various law enforcement agencies will purchase that information as well for surveillance purposes.
Further, some apps are malicious from the start. Google Play does its part to keep its virtual shelves free of malware-laden apps with a thorough submission process as reported by Google and through its App Defense Alliance that shares intelligence across a network of partners, of which we’re a proud member. Android users also have the option of running Play Protect to check apps for safety before they’re downloaded. Apple has its own rigorous submission process for weeding out fraud and malicious apps in its store as well.
Yet, bad actors find ways to sneak malware into app stores. Sometimes they upload an app that’s initially clean and then push the malware to users as part of an update. Other times, they’ll embed the malicious code so that it only triggers once it’s run in certain countries. They will also encrypt malicious code in the app that they submit, which can make it difficult for reviewers to sniff out. These apps will often steal data, and are designed to do so, including location information in some cases.
As far as cell phone service providers go, they have legitimate reasons for tracking your phone in the ways mentioned above. One is for providing connectivity to emergency service calls (again, like 911 in the U.S.), yet others are for troubleshooting and to ensure that only legitimate customers are accessing their network. And, depending on the carrier, they may use it for advertising purposes in programs that you may willingly opt into or that you must intentionally opt out of.
We each have our own comfort level when it comes to our privacy. For some, personalized ads have a certain appeal. For others, not so much, not when it involves sharing information about themselves. Yet arguably, some issues of privacy aren’t up for discussion, like ending up with a malicious data-stealing app on your phone.
In all, you can take several steps to limit tracking on your smartphone to various degrees, and boost your privacy as a result.
There’s no way around it. Using a smartphone puts you on the map. And to some extent, what you’re doing there as well. Outside of shutting down your phone or popping into Airplane Mode (noting what we said about iPhones and their “Find My Network” functionality above), you have no way of preventing location tracking. You can most certainly limit it.
For yet more ways you can lock down your privacy and your security on your phone, online protection software can help. Our McAfee+ plans protect you against identity theft, online scams, and other mobile threats—including credit card and bank fraud, emerging viruses, malicious texts and QR codes. For anyone who spends a good portion of their day on their phone, this kind of protection can make life far safer given all the things they do and keep on there.