An estimated 56% of employees are directly using AI to automate or augment job tasks.* Although these tools offer many benefits, including employee enablement, productivity, and saved time and resources, they also pose significant risk to your sensitive and private data. AI platforms like ChatGPT and Google Gemini collect the data submitted by end users to train their models. Any proprietary data submitted to these tools can become publicly available, putting your most valuable information at risk.

Join our upcoming webinar, where Lookout's cloud security experts Tyler Croak and Tony Kelly will walk you through best practices for enabling your workforce to use GenAI tools safely and without risk to your organization.

You'll leave with an understanding of:

- The risks of unsecured GenAI tools
- Real-world examples of accidental data leaks through AI tools
- Best practices for balancing employee enablement with data security
- An actionable 3-step plan for secure AI use in your organization

*Source: UKG
Vendor: Lookout
Posted: Apr 24, 2024
Published: Apr 24, 2024
Type: Replay

This resource is no longer available.