
Content provided by High Monkey. All podcast content, including episodes, artwork, and podcast descriptions, is uploaded and provided directly by High Monkey or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ar.player.fm/legal.
S3E2 - Intentional AI: Maximizing AI for research & analysis

22:07
 

In Episode 2 of the Intentional AI series, Cole and Virgil dive into the first real stage of the content lifecycle: research and analysis.

From brainstorming ideas to verifying data sources, AI is being used everywhere in the early stages of content creation. But how much of that information can you actually trust? In this episode, the team unpacks where AI helps, where it hurts, and why you still need to be the researcher of the research.

In this episode, they explore:

  • How AI fits into the research and analysis stage of the content lifecycle
  • The major risks of using AI for research, including accuracy, bias, and misinformation
  • Why trust, verification, and validation are now part of your job
  • Security and legal concerns around AI scraping and data usage
  • How different tools handle citations, transparency, and usability
  • Why you can’t skip the human role in confirming, editing, and contextualizing AI outputs

This episode also features the first step in a real experiment: researching a blog topic on digital accessibility using the tools Perplexity, ChatGPT, and Copilot. The results of that research will directly fuel the next episode on content creation.

A downloadable Episode Companion Guide is available below. It includes key episode takeaways, tool comparisons, and practical guidance on how to use AI responsibly during the research stage.


DS-S3-E2-CompanionDoc.pdf


Upcoming episodes in the Intentional AI series:

  • Oct 28, 2025 — Content Creation
  • Nov 11, 2025 — Content Management
  • Dec 2, 2025 — Accessibility
  • Dec 16, 2025 — SEO / AEO / GEO
  • Jan 6, 2026 — Content Personalization
  • Jan 20, 2026 — Front End Development & Wireframing
  • Feb 3, 2026 — Design & Media
  • Feb 17, 2026 — Back End Development
  • Mar 3, 2026 — Conversational Search (with special guest!)
  • Mar 17, 2026 — Chatbots & Agentic AI
  • Mar 31, 2026 — Series Finale & Tool Review

Whether you’re a marketer, strategist, or developer, this conversation is about making AI adoption intentional and keeping your critical thinking sharp.


New episodes every other Tuesday.


For more conversations about AI, digital strategy, and all the ways we get it wrong (and how to get it right), visit www.discussingstupid.com and subscribe on your favorite podcast platform.


(0:00) - Intro

(1:44) - Better research with AI

(3:46) - Risk: Trust & reliability

(5:29) - Risk: Security/legal concerns

(7:04) - Risk: Hallucinations

(9:17) - We tested 3 tools for AI research

(11:03) - Testing Perplexity

(14:38) - Testing ChatGPT

(17:45) - Testing Copilot

(19:54) - Comparing the tools and key takeaways

(20:52) - Outro


Subscribe for email updates on our website:

https://www.discussingstupid.com/

Watch us on YouTube:

https://www.youtube.com/@discussingstupid

Listen on Apple Podcasts, Spotify, or SoundCloud:

https://podcasts.apple.com/us/podcast/discussing-stupid-a-byte-sized-podcast-on-stupid-ux/id1428145024

https://open.spotify.com/show/0c47grVFmXk1cco63QioHp?si=87dbb37a4ca441c0

https://soundcloud.com/discussing-stupid

Check Us Out on Socials:

https://www.linkedin.com/company/discussing-stupid

https://www.instagram.com/discussingstupid/

https://www.facebook.com/discussingstupid

https://x.com/DiscussStupid

