“Condensation” by abramdemski

30:29

Content provided by LessWrong. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by LessWrong or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ar.player.fm/legal.
Condensation: a theory of concepts is a model of concept-formation by Sam Eisenstat. Its goals and methods resemble John Wentworth's natural abstractions/natural latents research.[1] Both theories seek to provide a clear picture of how to posit latent variables, such that once someone has understood the theory, they'll say "yep, I see now, that's how latent variables work!".
The goal of this post is to popularize Sam's theory and to give my own perspective on it; however, it will not be a full explanation of the math. For technical details, I suggest reading Sam's paper.
Brief Summary
Shannon's information theory focuses on the question of how to encode information when you have to encode everything. You get to design the coding scheme, but the information you'll have to encode is unknown (and you have some subjective probability distribution over what it will be). Your objective is to minimize the total expected code-length.
Algorithmic information theory similarly focuses on minimizing the total code-length, but it uses a "more objective" distribution (a universal algorithmic distribution), and a fixed coding scheme (some programming language). This allows it to talk about the minimum code-length of specific data (talking about particulars rather than average [...]
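To make the "minimize total expected code-length" objective concrete, here is a minimal Python sketch. It is my own illustration rather than anything from the post or the narration, and the four-message distribution is purely hypothetical: under a subjective distribution, expected code length is minimized by giving each message roughly -log2(p) bits, and that minimum equals the distribution's entropy.

import math

def expected_code_length(dist, lengths):
    # Expected number of bits when message x is encoded with lengths[x] bits.
    return sum(p * lengths[x] for x, p in dist.items())

# Hypothetical subjective distribution over four possible messages.
dist = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Idealized Shannon code lengths: -log2(p) bits per message.
shannon = {x: -math.log2(p) for x, p in dist.items()}

# A naive fixed-length code spends 2 bits on every message.
fixed = {x: 2.0 for x in dist}

print(expected_code_length(dist, shannon))  # 1.75 bits (the entropy)
print(expected_code_length(dist, fixed))    # 2.0 bits

The 0.25-bit gap is what a well-chosen coding scheme saves on average; algorithmic information theory asks the analogous question for a single specific string rather than for an average over a distribution.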
---
Outline:
(00:45) Brief Summary
(02:35) Shannon's Information Theory
(07:21) Universal Codes
(11:13) Condensation
(12:52) Universal Data-Structure?
(15:30) Well-Organized Notebooks
(18:18) Random Variables
(18:54) Givens
(19:50) Underlying Space
(20:33) Latents
(21:21) Contributions
(21:39) Top
(22:24) Bottoms
(22:55) Score
(24:29) Perfect Condensation
(25:52) Interpretability Solved?
(26:38) Condensation isn't as tight an abstraction as information theory.
(27:40) Condensation isn't a very good model of cognition.
(29:46) Much work to be done!
The original text contained 15 footnotes which were omitted from this narration.
---
First published: November 9th, 2025
Source: https://www.lesswrong.com/posts/BstHXPgQyfeNnLjjp/condensation
---
Narrated by TYPE III AUDIO.

665 episodes
