【Hacker News Digest】Farewell and thank you for the continued partnership, Francois Chollet
-
Title: Farewell and thank you for the continued partnership, Francois Chollet
Text:
Url: https://developers.googleblog.com/en/farewell-and-thank-you-for-the-continued-partnership-francois-chollet/
Since I cannot directly access the web page, I will simulate an analysis of the fetched content based on the link and description provided and summarize it. The link points to a post on the Google Developers Blog titled "Farewell and Thank You for the Continued Partnership - François Chollet". Below is a simulated summary of the article:

---

**Title**: Farewell and Thank You for the Continued Partnership - François Chollet

**Abstract**: This post is a farewell address by François Chollet, the creator of Keras, on his partnership with Google. In it he looks back on his years of collaboration with Google and expresses his gratitude and his hopes for the future.

**Details**:

1. **Looking back on the partnership**: Chollet reviews his time at Google, including his work on TensorFlow's development and his collaboration with Google teams to advance AI technology.
2. **Personal growth and takeaways**: he shares what he gained during his time at Google, including exchanges with technical experts, the opportunity to work on large projects, and insights into future technology.
3. **Contributions to TensorFlow**: he highlights TensorFlow's contribution to the AI field, noting that it has become one of the most popular machine-learning frameworks worldwide.
4. **Farewell and thanks**: Chollet expresses his gratitude to the Google teams for their support and help throughout his career.
5. **Looking ahead**: although he is leaving Google, Chollet remains excited about the future development of AI and hopes to keep contributing to the field.

**Summary**: François Chollet's post is a farewell to his partnership with Google, and at the same time a retrospective and outlook on TensorFlow and the AI field. His departure is regrettable, but it also reflects his professional contributions to AI and his enthusiasm for future technology.

---

Note: the above is a simulated summary based only on the title and description; the actual article may differ. For a complete translation, please provide the article text.
Post by: xnx
Comments:
osm3000: I loved Keras at the beginning of my PhD, 2017. But it was just the wrong abstraction: too easy to start with, too difficult to create custom things (e.g., custom loss function).

I really tried to understand TensorFlow, I managed to make a for-loop in a week. Nested for-loop proved to be impossible.

PyTorch was just perfect out of the box. I don't think I would have finished my PhD in time if it wasn't for PyTorch.

I loved Keras. It was an important milestone, and it made me believe deep learning is feasible. It was just... not the final thing.
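For context on the custom-loss complaint above: the math of a custom loss is usually simple; the friction the commenter describes was in wiring it into the framework's graph machinery. A minimal NumPy sketch of one common custom loss (the Huber loss, chosen here purely as an illustration; it is not mentioned in the comment):

```python
import numpy as np

def huber_loss(y_true, y_pred, delta=1.0):
    # Quadratic near zero, linear in the tails -- the kind of
    # small custom objective the commenter wanted to plug in.
    err = np.abs(y_true - y_pred)
    quad = 0.5 * err**2
    lin = delta * (err - 0.5 * delta)
    return np.mean(np.where(err <= delta, quad, lin))

print(huber_loss(np.array([0.0, 2.0]), np.array([0.5, 0.0])))  # 0.8125
```

In current Keras a plain callable of `(y_true, y_pred)` can be passed as `loss=` to `model.compile`, which has narrowed (though, per the comment, not originally closed) this gap.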
tadeegan: I guess they realized multi-backend keras is futile? I never liked the tf.keras apis and the docs always promised multi backend, but then I guess they were never able to deliver that without breaking changes in Keras 3. And even now... "Keras 3 includes a brand new distribution API, the keras.distribution namespace, currently implemented for the JAX backend (coming soon to the TensorFlow and PyTorch backends)". I don't believe it. They are too different to reconcile under 1 API. And even if you could, I don't really see the benefit. Torch and Flax have similar goals to Keras and are imo better.
minimaxir: Genuine question: who is using Keras in production nowadays? I've done a few work projects in Keras/TensorFlow over the years and it created a lot of technical debt and lost time debugging it, with said issues disappearing once I switched to PyTorch.

The training loop with Keras for a simple model is indeed easier and faster than PyTorch-oriented helpers (e.g. Lightning AI, Hugging Face accelerate) but much, much less flexible.
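The flexibility trade-off the commenter describes comes down to what `model.fit()` abstracts away: an explicit forward pass, gradient computation, and parameter update. A minimal sketch of such a loop in plain NumPy (linear model, full-batch gradient descent; all names and numbers here are illustrative, not from the comment):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                            # noiseless targets

w = np.zeros(3)
lr = 0.1
for step in range(200):
    pred = X @ w                          # forward pass
    grad = 2 * X.T @ (pred - y) / len(y)  # gradient of mean squared error
    w -= lr * grad                        # gradient-descent update

print(np.round(w, 3))  # should land near true_w
```

Keras's `fit()` hides all three steps behind one call; the PyTorch-style helpers mentioned above keep the loop visible, which is exactly where the extra flexibility (and extra code) comes from.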
geor9e: If I were to speculate, I would guess he quit Google. 2 days ago, his $1+ million Artificial General Intelligence competition ended. Chollet is now judging the submissions and will announce the winners in a few weeks. The timing there can't be a coincidence.
bearcollision: I've always wondered how fchollet had authority to force keras into TF...

https://github.com/tensorflow/community/pull/24