The Case for App Scanning and SDK Governance: Lessons from the Texas Lawsuit

The State of Texas has filed a lawsuit against a large insurance company and its analytics subsidiary for alleged violations of the Texas Data Privacy and Security Act (TDPSA), the Data Broker Law, and the Texas Insurance Code.

What happened:
- A large insurance company and its analytics subsidiary created a Software Development Kit (SDK) that was embedded into third-party apps offering location-based services.
- This SDK secretly collected sensitive user data, including precise location, speed, direction, and other phone sensor data, without users' awareness.
- The collected data was used to build a massive driving-behaviour database covering millions of users.
- This data was monetized, influencing insurance premiums and policies, often without users' knowledge or consent.
- Users were not told how their data was being collected or shared, and privacy policies were neither clear nor accessible.

Key issues:
1) No user consent: People did not know their data was being collected or sold.
2) Inaccurate profiling: The SDK often mistook passengers or other scenarios for "bad driving," producing misleading profiles.
3) Non-compliance: The analytics subsidiary failed to register as a data broker, as required by Texas law.

Why this matters:
This case highlights the risks of hidden data collection in apps. It shows how companies can misuse sensitive data and why stronger controls are needed to protect user privacy.

The way forward:
To address these risks effectively, organizations should implement the following measures:
a) Conduct regular mobile app scanning: Analyze apps weekly or bi-weekly to identify permissions, embedded SDKs, and dataflows.
b) Govern SDKs effectively: Establish strict policies for integrating and monitoring SDKs. Require transparency from SDK providers about what data is collected, how it is used, and who it is shared with. Avoid SDKs that fail to meet these standards.
c) Monitor hidden dataflows: SDKs often operate in the background and can piggyback on permissions granted to the host app to collect sensitive data. Regularly audit these dataflows to uncover implicit collection or sharing practices and address potential violations proactively.
d) Communicate transparently with users: Update #privacy policies to clearly explain what data is collected, how it will be used, and who it will be shared with. Obtain explicit consent before collecting or sharing sensitive data.

The risks of hidden #dataflows and implicit data collection are significant, especially as #SDKs become more complex. How frequently does your team #audit apps for SDK behaviors and permissions? What tools or strategies have you found most effective in uncovering hidden #datasharing?
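The scanning and governance steps above can be sketched as a simple policy check. This is illustrative Python only, assuming an upstream scan has already extracted each app's declared permissions and embedded SDK package prefixes (e.g. via an APK analysis tool); the permission names are real Android identifiers, but the policy sets and the `audit_app` helper are invented for the sketch.

```python
# Minimal sketch of an SDK-governance check. Assumes a prior scan step has
# extracted declared permissions and embedded SDK package names from each
# app build. Policy sets below are illustrative, not a real policy.

SENSITIVE_PERMISSIONS = {
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.ACCESS_BACKGROUND_LOCATION",
    "android.permission.READ_SMS",
}

# SDK prefixes your organization has vetted; anything else is flagged.
APPROVED_SDK_PREFIXES = {"com.example.analytics", "com.example.crashlogs"}

def audit_app(permissions, sdk_packages):
    """Return a list of governance findings for one scanned app."""
    findings = []
    for perm in permissions:
        if perm in SENSITIVE_PERMISSIONS:
            findings.append(f"sensitive permission declared: {perm}")
    for pkg in sdk_packages:
        if not any(pkg.startswith(p) for p in APPROVED_SDK_PREFIXES):
            findings.append(f"unvetted SDK embedded: {pkg}")
    return findings

findings = audit_app(
    permissions=["android.permission.INTERNET",
                 "android.permission.ACCESS_FINE_LOCATION"],
    sdk_packages=["com.example.analytics", "com.shadyvendor.telemetry"],
)
for f in findings:
    print(f)
```

Running a check like this on a weekly cadence, and failing the build when findings appear, is one way to make the governance policy enforceable rather than aspirational.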
Data Collection Deception
Summary
Data collection deception refers to any practice where organizations or individuals gather personal data without fully informing users or obtaining proper consent, often hiding their true intentions or the extent of data use. This hidden or misleading data harvesting can put privacy at risk and expose individuals to scams or misuse of their information.
- Scrutinize consent requests: Always check what information an app, website, or survey is asking for and read privacy disclosures carefully before sharing any details.
- Audit your digital footprint: Regularly review which apps, services, or data broker sites might be collecting or sharing your personal information and consider removing unnecessary permissions or opting out where possible.
- Question unnecessary data asks: If a service or survey requests sensitive information that doesn't seem relevant, treat it with suspicion and avoid providing more than needed.
-
Warning to all LinkedIn users: You receive a message from a person supposedly connected to a reputable survey company such as IDR. The incoming email address or user looks legitimate, and the message is addressed to you personally (e.g. "Dear Clive…"). It invites you to participate in a compensated survey (e.g. for $30 or $50) related to your industry (e.g. "a study focused on ETF usage and selection practices among financial professionals across Europe"). The message says, "If you qualify based on a short screening form, you'll be invited to complete a brief survey," and contains a link to an external form (e.g. a Microsoft Forms link).

This "screening" appears innocuous, but you are still asked to share potentially sensitive information such as your name, email address, LinkedIn profile, and place of work. No matter what rubbish you write in the screening, you will automatically qualify for the survey. The subsequent "survey" starts with some innocuous questions before seeking answers to more sensitive ones:
- Personal identifiers: The form often asks for full legal name, business email, and phone number.
- Professional information: Common questions include employer/company name, job title, and role (e.g. "Please briefly describe the focus of your role and main responsibilities."). You may also be asked for other sensitive information such as your client types or assets under management.

This level of data collection is unnecessary for market research and is a typical hallmark of scams. You might then receive a follow-up call asking for your bank account details or PayPal information "for payment," along with a copy of your photo ID, business card, or proof of employment to confirm your identity or eligibility for the honorarium. Providing any of this information opens you to identity theft, phishing, social engineering, account compromise, and financial fraud. Do not submit any sensitive information through these forms. Even initial "screening" steps are often used to harvest professional and contact data for further targeting.
-
TRUST BREAKS BEFORE IT BENDS: "The mechanism is technically precise and deliberately invisible." Precision means intent. Invisibility means risk. Together, they mean fingerprinting, not analytics.

What LinkedIn is doing is environmental intelligence: understanding what tools you use, what signals you emit, and what your digital posture reveals about your role, your employer, your vulnerabilities, and your "competitive value." When a platform with nearly a billion users can silently inventory your browser environment, the situation goes far beyond privacy and notions of consent. And it will lead to stinging discontent among users.

Think about it. This was a 6,167-item fingerprinting operation running silently inside Chromium-based browsers (Chrome, Edge, Brave, Opera, Arc). The script executed in milliseconds, checked for thousands of extensions, encrypted the results, and shipped them off to LinkedIn and third-party endpoints. All without disclosure. All mapped to real identities. Systems built to see without being seen rarely announce themselves. (They wait for us to realize the architecture was the point all along.)

Since this story broke there has been no public statement from either LinkedIn or its parent company. I will continue to monitor and update. This is both important and outrageous. Under GDPR, much of this detailed data collection qualifies as Special Category Data, the kind you can't process without explicit consent. And consent was never part of the design. https://lnkd.in/gbFvzSZh #AuguryIT #microsoft #privacyprotection
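The core mechanism described above can be sketched conceptually. In a real browser this kind of probe runs in page JavaScript against extension resource URLs; the Python below only simulates the idea, namely that a presence/absence bitmap over a large catalogue of extension IDs hashes down to a stable per-user identifier. The catalogue IDs and the `fingerprint` helper are invented for illustration.

```python
# Conceptual sketch of extension fingerprinting. A real script probes
# chrome-extension:// resource URLs from page JavaScript; here the probe
# results are passed in directly so the mechanism is visible:
# presence bitmap over a known catalogue -> stable fingerprint hash.
import hashlib

# Catalogue of extension IDs to test for; the real script reportedly
# checked thousands. These IDs are made up.
CATALOGUE = ["aaaa1111", "bbbb2222", "cccc3333", "dddd4444"]

def fingerprint(installed_extensions):
    """Hash which catalogue entries are present into one identifier."""
    bitmap = "".join(
        "1" if ext_id in installed_extensions else "0"
        for ext_id in CATALOGUE
    )
    return hashlib.sha256(bitmap.encode()).hexdigest()[:16]

# Two users with different extension sets get different, stable IDs.
user_a = fingerprint({"aaaa1111", "cccc3333"})
user_b = fingerprint({"bbbb2222"})
print(user_a, user_b)
```

The point of the sketch: no single probe is identifying, but thousands of yes/no answers combined are close to unique, and the resulting hash survives cookie deletion entirely.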
-
👉 Researchers have recently discovered a method employed by Meta (Facebook, Instagram) and Yandex to monitor Android users' browsing behaviour by exploiting a system-level loophole that bypasses users' privacy safeguards. I am not an IT specialist, but the explanation is clear enough to follow and to translate into practical terms.

When a user has one of these apps installed, the app opens a local communication port on the device, commonly referred to as localhost. If the app is running in the background and the user visits a website that includes tracking scripts such as Meta Pixel or Yandex Metrica (Meta Pixel alone is present on over 5.8 million websites), those scripts can detect the app and establish a direct connection to it via the localhost port. This allows the app to collect information about the website visit, potentially including session data, cookies, or user identifiers. Because this communication happens locally, it is unaffected by the typical privacy measures users might rely on, such as private browsing modes, cookie deletion, or the use of a VPN. This behaviour is fully documented and clearly explained at https://lnkd.in/daSMtPr3.

What stands out is that this is not an exploit in the traditional sense; it simply relies on the way Android handles local network communication. It is a technically clever implementation with significant implications: it establishes a direct, hidden exchange between the browser and the app, using the device's architecture, without the user's awareness or control. In privacy discussions we often refer to dark patterns, fingerprinting, and similar techniques. This example introduces a different type of concern. It is not only about the method of data collection but about the extent to which a user's attempts to maintain privacy can be quietly circumvented. There is value in paying close attention to these less visible, system-level mechanisms. If this form of tracking becomes commonplace, many of the tools users rely on to safeguard their privacy, including browsers, cookie settings, and VPNs, risk becoming ineffective. And consent, in such a scenario, risks becoming little more than a formality.
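A toy sketch of the localhost channel described above: one thread stands in for the installed app listening on a local port, and the main thread stands in for a tracking script handing over a browser-side identifier. The port handling, payload, and names are invented for illustration; the real Android mechanism is more involved, but the essence is the same, since traffic to 127.0.0.1 never leaves the device and so never passes a VPN or browser privacy control.

```python
# Toy demo: a background "app" listens on localhost, and a sender standing
# in for a web tracking script hands it a browser session identifier.
import socket
import threading

received = []           # what the "app" ends up knowing
bound = {}              # actual port, filled in by the listener
ready = threading.Event()

def background_app():
    """Simulates the installed app listening on a local port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 0))      # OS picks a free port for the demo
        srv.listen(1)
        bound["port"] = srv.getsockname()[1]
        ready.set()
        conn, _ = srv.accept()
        with conn:
            received.append(conn.recv(1024).decode())

listener = threading.Thread(target=background_app)
listener.start()
ready.wait()

# The "tracking script": passes a browser-side identifier to the local app.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", bound["port"]))
    cli.sendall(b"session_cookie=abc123")

listener.join()
print(received[0])  # the app now holds the browser identifier
```

Nothing in this exchange touches the network interface a VPN tunnels, and deleting cookies in the browser does not claw back what the app has already received.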
-
So let’s talk about the quiet middlemen nobody invited but everyone’s data somehow met anyway: data brokers. These are the companies that scrape, buy, sell, and repackage your personal information, then slap it on a website and call it “people search” or “public records.” Sounds harmless. It’s not.

As a Cybercrimes Detective, I can tell you exactly how scammers use these sites:
✅ They look you up by name or phone number
✅ They get your current and past addresses
✅ Your relatives and associates
✅ Your age range, emails, sometimes employment history

That’s not trivia. That’s a scam blueprint. This is how scams go from random to personal. “Hi, this is your bank.” “Hi, this is law enforcement.” “Hi, this is your grandson.” When a scammer already knows where you live, who you’re related to, and what city you’re in, the lie lands harder. Trust comes faster. Victims comply sooner. Data brokers don’t usually scam people. But they absolutely fuel scams.

So what can you do?
1️⃣ Manual opt-outs: Most data broker sites have opt-out pages. They’re buried. They’re annoying. And there are dozens of them. But it’s free if you have the time and patience.
2️⃣ Use removal services: Companies like Incogni, DeleteMe, Optery, Aura, and others will do the legwork for you. You’re essentially paying someone to play whack-a-mole with your data year-round. For many people, that’s worth it.
3️⃣ California’s new Delete Act: California now allows residents to submit a single request that requires registered data brokers to delete their personal information. This is a big step, and other states are watching closely.

Bottom line: you can’t stop data collection entirely, but you can reduce your digital footprint. Every record removed is one less puzzle piece a scammer can use against you. #DataPrivacyWeek isn’t just about strong passwords. It’s about starving scammers of the information they rely on. Pause. Think. Verify. And maybe… delete yourself from the internet just a little.
#FraudHero #fraud #scams #databroker #fraudprevention #PauseThinkVerify #StoptheScam
-
Cars are no longer just vehicles; they are #surveillance tools on wheels, tracking your every move. General Motors and its subsidiary OnStar recently came under fire from the Federal Trade Commission for secretly collecting and selling drivers’ precise location and behavior #data. Without clear #consent, GM monitored everything from where drivers went to how often they braked hard or drove late at night. This information was then sold to consumer reporting agencies, which used it to influence insurance rates and other financial decisions, often without drivers even realizing it.

The FTC alleges that GM misled consumers through a confusing enrollment process for its OnStar Smart Driver program. Many drivers signed up under the impression they were simply accessing safety features or tools to improve driving habits, not consenting to have their every move tracked and subsequently monetized. GM’s practices exposed not just where people traveled but intimate details about their lives, including visits to medical facilities.

The FTC’s settlement proposed today bans GM and OnStar from selling this data for five years and requires them to seek explicit, informed consent before collecting location and behavior data. Consumers must also be given the ability to access, delete, or limit the data collected from their vehicles. Companies view data collection as essential for innovation, but too often it is used mainly to maximize profit at the expense of transparency and trust. Today the FTC affirms that when it comes to data collection, and really in all areas of life, if you want to be creepy, you have to ask first!
-
Booking.com, MakeMyTrip, and HotelTonight are the ultimate “champions” of data collection, an exclusive investigation has revealed. The study examined 22 widely used hospitality and vacation-planning apps, downloaded by millions of users from the Google Play Store, to determine what data they access and might collect. The findings were surprising: not only do some apps fail to disclose that they collect your sensitive data, but there often appears to be no legitimate reason for harvesting it, either.

14 of the 22 tested travel apps are granted permission to access the device’s camera to take photos, record videos, and conduct video calls. An app could potentially do this without user consent, compromising the user's privacy and security. 10 apps failed to disclose the collection of camera-related data on the Google Play Store; the ones that did disclose it said such permission was mostly needed for “app functionality” and “analytics.”

Some of the apps had even more pervasive permissions. The research revealed that MakeMyTrip, a popular Indian app with over 50 million downloads for booking hotels, flights, and transport, can read entire SMS messages stored on the device... #informationsecurity #cybersecurity #infosec #cyber #applicationsecurity #privacy #datacollection #travel
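The disclosure-gap finding above (permissions granted but collection never disclosed) boils down to a set difference between what an app's permissions imply and what its store listing declares. The sketch below is illustrative Python; the permission-to-data-type mapping and the example app are invented, not taken from the study.

```python
# Minimal sketch of a disclosure-gap check: data types implied by an app's
# declared permissions, minus what its store listing discloses. The mapping
# and example values are illustrative only.
PERMISSION_IMPLIES = {
    "CAMERA": "camera data",
    "READ_SMS": "SMS content",
    "ACCESS_FINE_LOCATION": "precise location",
}

def disclosure_gaps(declared_permissions, disclosed_data_types):
    """Data types the permissions imply but the listing never discloses."""
    implied = {
        PERMISSION_IMPLIES[p]
        for p in declared_permissions
        if p in PERMISSION_IMPLIES
    }
    return sorted(implied - set(disclosed_data_types))

gaps = disclosure_gaps(
    declared_permissions=["CAMERA", "READ_SMS", "INTERNET"],
    disclosed_data_types=["camera data"],  # SMS collection never mentioned
)
print(gaps)
```

A non-empty result does not prove wrongdoing, but it is exactly the kind of mismatch that merits a "why does a hotel app need this?" follow-up.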
-
Connected vehicles and the perils of data collection without customer consent! Consumer data has become a valuable asset across industries, including automotive. However, it is crucial to handle this data responsibly and in compliance with regulations. The case against General Motors (GM) and OnStar, ongoing since last year, and the recent settlement serve as a powerful reminder.

GM and its subsidiary OnStar collected data on drivers' geolocation and driving behavior, including hard braking, speeding, and late-night driving, without proper consent. The Federal Trade Commission (FTC) found that GM used a misleading enrollment process to gather this data through OnStar's Smart Driver feature. This data was then sold to consumer reporting agencies, affecting insurance rates and coverage for millions of drivers. As a result, GM agreed to a settlement with the FTC, which includes a five-year ban on disclosing consumers’ sensitive geolocation and driver behavior data to consumer reporting agencies. GM must also take other steps to provide greater transparency and choice to consumers over the collection, use, and disclosure of their connected vehicle data. This is the FTC’s first action related to connected vehicle data.

This case underscores the importance of transparency and informed consent. OEMs must prioritize ethical data practices to build trust with consumers and avoid legal repercussions: prioritize transparency, obtain explicit consent from consumers, and safeguard their data. Very recently, our team was in discussions with an auto OEM on a similar topic and how technology platforms can help in this regard. Get in touch to learn more! Bryan Wong Jonathan Osborne MBA FIP MBCS Kiran Kumar Gaddi Premanand Venkatapathy https://lnkd.in/eCUYNiby