Content provided by LessWrong. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by LessWrong or its podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://de.player.fm/legal.
“Condensation” by abramdemski

30:29
 
Condensation: a theory of concepts is a model of concept-formation by Sam Eisenstat. Its goals and methods resemble John Wentworth's natural abstractions/natural latents research.[1] Both theories seek to provide a clear picture of how to posit latent variables, such that once someone has understood the theory, they'll say "yep, I see now, that's how latent variables work!".
The goal of this post is to popularize Sam's theory and to give my own perspective on it; however, it will not be a full explanation of the math. For technical details, I suggest reading Sam's paper.
Brief Summary
Shannon's information theory focuses on the question of how to encode information when you have to encode everything. You get to design the coding scheme, but the information you'll have to encode is unknown (and you have some subjective probability distribution over what it will be). Your objective is to minimize the total expected code-length.
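To make the expected-code-length objective concrete, here is a minimal Python sketch (not from the original post): it uses a made-up subjective distribution and the idealized optimal codeword length of -log2 p(x) bits from Shannon's source-coding theorem, so the expected length comes out to the entropy of the distribution.

```python
# Minimal sketch: expected code length under a subjective distribution,
# using the idealized optimal codeword lengths l(x) = -log2 p(x).
# The distribution below is a made-up example, not from the original post.
import math

# Hypothetical subjective distribution over the messages we might need to encode.
p = {"sunny": 0.5, "cloudy": 0.25, "rain": 0.125, "snow": 0.125}

# Idealized optimal codeword lengths in bits.
lengths = {x: -math.log2(px) for x, px in p.items()}

# Expected code length = sum_x p(x) * l(x); with lengths matched to p,
# this equals the entropy H(p). Any other assignment of lengths gives a
# larger expected length.
expected_length = sum(px * lengths[x] for x, px in p.items())

print(lengths)          # {'sunny': 1.0, 'cloudy': 2.0, 'rain': 3.0, 'snow': 3.0}
print(expected_length)  # 1.75 bits per message on average
```

The point of the sketch is only that the coding scheme is chosen in advance to fit the subjective distribution, while the actual message is unknown; the objective being minimized is the average, not the length of any particular message.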
Algorithmic information theory similarly focuses on minimizing the total code-length, but it uses a "more objective" distribution (a universal algorithmic distribution), and a fixed coding scheme (some programming language). This allows it to talk about the minimum code-length of specific data (talking about particulars rather than average [...]
---
Outline:
(00:45) Brief Summary
(02:35) Shannon's Information Theory
(07:21) Universal Codes
(11:13) Condensation
(12:52) Universal Data-Structure?
(15:30) Well-Organized Notebooks
(18:18) Random Variables
(18:54) Givens
(19:50) Underlying Space
(20:33) Latents
(21:21) Contributions
(21:39) Top
(22:24) Bottoms
(22:55) Score
(24:29) Perfect Condensation
(25:52) Interpretability Solved?
(26:38) Condensation isn't as tight an abstraction as information theory.
(27:40) Condensation isn't a very good model of cognition.
(29:46) Much work to be done!
The original text contained 15 footnotes which were omitted from this narration.
---
First published:
November 9th, 2025
Source:
https://www.lesswrong.com/posts/BstHXPgQyfeNnLjjp/condensation
---
Narrated by TYPE III AUDIO.