Grip Security’s Post

🚨 PSA 🚨 LinkedIn automatically opted everyone in to allowing LinkedIn and its affiliates to use your personal data and the content you post to train its generative AI models. To disable this, each user must go into their profile and turn it off manually (Settings > Data Privacy > Data for Generative AI Improvement).

This situation highlights some key security concerns, particularly around AI tools being rolled out at lightning speed. It's crucial to know which AI tools are in use within your organization and how they interact with your systems. And it's not just new AI tools you need to monitor: sanctioned apps are quickly adding #GenAI features, and you'll want visibility into those changes. Is employee usage growing? Does the app still align with your security policies? Are stronger authentication controls needed?

Without visibility into shadow SaaS and shadow AI, you also risk #compliance violations. Take this LinkedIn example: the UK ICO found LinkedIn was using member data without explicit consent, a #GDPR requirement. Similarly, other compliance standards, such as NYDFS, PCI-DSS 4.0, and HIPAA, require #MFA for access to sensitive customer information. Without visibility into ALL the applications in use across your enterprise, you could land in compliance hot water if shadow SaaS and shadow AI tools aren't secured appropriately, or if they gather or store sensitive data without consent.

SaaS usage has evolved drastically. To help you stay ahead, we've put together tips for developing your 2025 SaaS security strategy along with a best practices checklist. It's free, ungated, and ready for you to dive into. No form fills required: https://lnkd.in/ePZBqHst

#SaaSSecurity #ShadowSaaS #ShadowAI #CyberSecurity #Compliance #GenAI
