
How Deep Fake Videos Increase Security Risks.

1:15:03
 
Manage episode 410826703 series 3370503
Content provided by Cyber Crime Junkies. Host David Mauro. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Cyber Crime Junkies. Host David Mauro or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://ar.player.fm/legal.

NEW! Text Us Direct Here!

Paul Eckloff served as a US Secret Service agent for 23 years. Today we discuss how deepfake videos increase security risks. Topics include: artificial intelligence risks in cybersecurity, new ways to reduce the risk of deepfakes, how deepfake videos and audio deepfakes are made, and how AI is making deepfakes harder to detect.
Catch the video episode with sample deepfakes here: https://youtu.be/1yFFK6uHt0I?si=qP9F1_uIZ7q6qGSS
Takeaways

  • Sample deepfakes are played. Can you tell? Over 85% of those tested could not.
  • Deepfake technology, created using techniques like GANs, diffusion models, and VAEs, can convincingly substitute one person's face or voice with another's (see the brief GAN sketch after this list).
  • The advancement of deepfake technology poses risks such as impersonating executives, enhancing social engineering campaigns, avoiding detection in malware, and conducting reconnaissance for future attacks.
  • The widespread availability and low cost of deepfake technology make it accessible to both legitimate businesses and threat actors, increasing the threat surface for organizations.
  • The potential for deepfakes to manipulate and deceive individuals, especially children, is a grave concern.
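
To make the GAN idea in the takeaway above concrete, here is a minimal, hypothetical sketch (in PyTorch, which is an assumption; the episode does not prescribe any particular toolkit): a generator learns to produce samples a discriminator can no longer tell apart from real data. Real deepfake pipelines apply the same adversarial loop to images or audio with convolutional encoders/decoders, face alignment, and large datasets; this toy version only learns a one-dimensional Gaussian.

    # Toy GAN sketch: adversarial training on a 1-D Gaussian stand-in for "real" data.
    # Illustrative only; not the pipeline discussed in the episode.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    def real_batch(n: int) -> torch.Tensor:
        # "Real" samples: N(4, 1.25), standing in for real image/audio features.
        return 4.0 + 1.25 * torch.randn(n, 1)

    # Generator maps random noise to candidate samples; discriminator scores realness.
    generator = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
    discriminator = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))

    g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
    d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()

    for step in range(2000):
        # Discriminator step: label real samples 1, generated fakes 0.
        real = real_batch(64)
        fake = generator(torch.randn(64, 8)).detach()
        d_loss = (loss_fn(discriminator(real), torch.ones(64, 1)) +
                  loss_fn(discriminator(fake), torch.zeros(64, 1)))
        d_opt.zero_grad()
        d_loss.backward()
        d_opt.step()

        # Generator step: produce fakes the discriminator scores as real.
        fake = generator(torch.randn(64, 8))
        g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
        g_opt.zero_grad()
        g_loss.backward()
        g_opt.step()

    with torch.no_grad():
        samples = generator(torch.randn(1000, 8))
    print(f"generated mean={samples.mean().item():.2f} "
          f"std={samples.std().item():.2f} (target: 4.00 / 1.25)")

After training, the generator's output statistics approach the real distribution, which is the same pressure that makes deepfaked faces and voices steadily harder to distinguish from genuine recordings.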

Chapters

  • 00:00 Introduction to Deepfake Technology and its Impact
  • 03:27 The Challenges of Detecting Deepfakes
  • 08:04 The Erosion of Trust: Seeing is No Longer Believing
  • 11:31 The Advancement of Deepfake Technology
  • 26:53 The Malicious Uses of Deepfake Technology
  • 36:17 The Risks of Deepfake Technology
  • 37:42 Consequences of Deepfakes
  • 40:38 Limitations of Deepfake Detection

Try KiteWorks today at www.KiteWorks.com
Don't miss our video on this KiteWorks offer: the most secure managed file transfer system.

You can now text our podcast studio directly to ask questions and suggest guests and stories.
Click the link above or text 904-867-4468, 2014652: and leave your message!
We look forward to hearing from you!

Custom handmade Women's Clothing, Plushies & Accessories at Blushingintrovert.com. Portions of your purchase go to Mental Health Awareness efforts.


191 episodes
