David and Randy respond to an article that makes the case for JAX over Julia for machine learning, particularly when applied to solving differential equations.
David also shares a series of workshops hosted by the Julia Gender Inclusive community, as well as a new package by Elias Carvalho for creating truth tables from Julia expressions, and Randy explores a YouTube series and set of Pluto notebooks all about partially observable Markov decision processes (POMDPs).
Support Talk Julia on Ko-Fi
We're excited to announce that we have opened up podcast memberships! Become a member for as little as $5/mo to get early access to episodes and social media shoutouts; at the $10/mo tier and up, you also get access to a members-only "office hours" call.
Your support helps us continue to bring you interviews and educational Julia content each week. It also helps us grow sustainably and improve the quality of the podcast.
Become a member today 👉
Manning Publications Discount
We've partnered with Manning Publications to bring all of our listeners a special 35% discount code on all of Manning's physical books, ebooks, courses, and more. There's no expiration date and you can use the discount as many times as you like!
Just visit http://mng.bz/pOzw and use the code podtalkjulia22 at checkout to get 35% off your order!
- JAX vs Julia (vs PyTorch) by Patrick Kidger — A while ago there was an interesting thread on the Julia Discourse about the “state of machine learning in Julia”. I posted a response discussing the differences between Julia and Python (both JAX and PyTorch), and it seemed to be really well received!
Since then this topic seems to keep coming up, so I thought I’d tidy up that post and put it somewhere I could link to easily. Rather than telling all the people who ask for my opinion to go searching through the Julia Discourse until they find that one post… :D
To my mind JAX and Julia are unquestionably the current state-of-the-art frameworks for autodifferentiation, scientific computing, and ML computing. So let’s dig into the differences.
- Doing small network scientific machine learning in Julia 5x faster than PyTorch by Chris Elrod, Niklas Korsbo, Chris Rackauckas — Machine learning is a huge discipline, with applications ranging from natural language processing to solving partial differential equations. It is from this landscape that major frameworks such as PyTorch, TensorFlow, and Flux.jl arise and strive to be packages for "all of machine learning"... However, the ability to easily construct machine learning libraries presents an interesting question: can this development feature be used to easily create alternative frameworks which focus their performance on more non-traditional applications or aspects?
The answer is yes, you can quickly build machine learning implementations which greatly outperform the frameworks in specialized cases using the Julia programming language, and we demonstrate this with our new package: SimpleChains.jl.
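For readers curious what this looks like in practice, here is a minimal sketch of defining a small dense network with SimpleChains.jl, loosely following the package's README. The layer sizes and activations are arbitrary illustrative choices, not anything from the episode or article:

```julia
using SimpleChains

# A small fixed-size MLP: 2 inputs -> 8 hidden units (tanh) -> 1 output.
# static(2) fixes the input dimension at compile time, which is part of
# how SimpleChains avoids per-call overhead on small networks.
model = SimpleChain(
    static(2),
    TurboDense(tanh, 8),
    TurboDense(identity, 1),
)

# Parameters live in one flat vector rather than per-layer arrays.
params = SimpleChains.init_params(model)

# Forward pass on a single input.
x = rand(Float32, 2)
y = model(x, params)
```

Training would additionally attach a loss to the chain; see the SimpleChains.jl repository for complete, up-to-date examples.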
- SimpleChains.jl (GitHub)
- Julia Gender Inclusive (Meetup)
- Julia Gender Inclusive (Twitter)
- Learn Julia With Us (YouTube Playlist)
- Decision Making Under Uncertainty using POMDPs.jl (YouTube Playlist)
- Decision Making Under Uncertainty (GitHub Repo)
- POMDPs.jl (GitHub Repo)
- TruthTables.jl (GitHub Repo)
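As a quick taste of TruthTables.jl, the package's README centers on a `@truthtable` macro that takes a Julia Boolean expression and prints its truth table. A minimal sketch, assuming the operator syntax shown in the README:

```julia
using TruthTables

# Build a truth table for a compound proposition; each variable
# (here p and q) gets a column, and the final column holds the
# value of the whole expression for that row.
@truthtable p && q

# The README also shows arrow operators for implication (-->)
# and biconditional (<-->):
@truthtable (p --> q) <--> (!p || q)
```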