
Content provided by Soroush Pour. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Soroush Pour or his podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://de.player.fm/legal.

Ep 8 - Getting started in AI safety & alignment w/ Jamie Bernardi (AI Safety Lead, BlueDot Impact)

1:07:23
 
 


We speak with Jamie Bernardi, co-founder & AI Safety Lead at the not-for-profit BlueDot Impact, which hosts the largest and most up-to-date courses on AI safety & alignment at AI Safety Fundamentals (https://aisafetyfundamentals.com/). Jamie completed his Bachelor's (Physical Natural Sciences) and Master's (Physics) at the University of Cambridge and worked as an ML engineer before co-founding BlueDot Impact.
The free courses they offer are created in collaboration with people at the cutting edge of AI safety, such as Richard Ngo at OpenAI and Prof. David Krueger at the University of Cambridge. These courses have been one of the most powerful ways for new people to enter the field of AI safety. I myself (Soroush) have taken AGI Safety Fundamentals 101, an exceptional course that was crucial to my understanding of the field and that I can highly recommend. Jamie shares why he got into AI safety, some recent history of the field, an overview of its current state, and how listeners can get involved and start contributing to ensure a safe & positive world with advanced AI and AGI.
Hosted by Soroush Pour. Follow me for more AGI content:
Twitter: https://twitter.com/soroushjp
LinkedIn: https://www.linkedin.com/in/soroushjp/
== Show links ==
-- About Jamie --
* Website: https://jamiebernardi.com/
* Twitter: https://twitter.com/The_JBernardi
* BlueDot Impact: https://www.bluedotimpact.org/
-- Further resources --
* AI Safety Fundamentals courses: https://aisafetyfundamentals.com/
* Donate to LTFF to support AI safety initiatives: https://funds.effectivealtruism.org/funds/far-future
* Jobs + opportunities in AI safety:
  * https://aisafetyfundamentals.com/opportunities
  * https://jobs.80000hours.org
* Horizon Fellowship for policy training in AI safety: https://www.horizonpublicservice.org/fellowship
Recorded Sep 7, 2023
