[Hacker News Repost] Bayesian Neural Networks
-
Title: Bayesian Neural Networks
Text:
Url: https://www.cs.toronto.edu/~duvenaud/distill_bayes_net/public/
## Post by: reqo

### Comments:

**dccsillag**: Bayesian Neural Networks just seem like a failed approach, unfortunately. For one, Bayesian inference and UQ fundamentally depend on the choice of the prior, but this is rarely discussed in the Bayesian NN literature and practice, and the problem is compounded by how hard these priors are to interpret and choose (what is the intuition behind a NN's parameters?). Add to that the fact that the Bayesian inference is very much approximate, and you should see the trouble.

If you want UQ, "frequentist nonparametric" approaches like Conformal Prediction and Calibration/Multi-Calibration methods seem to work quite well (especially when combined with the standard ML machinery of taking a log-likelihood as your loss), and do not suffer from any of the issues above while also giving you formal guarantees of correctness. They are a strict improvement over Bayesian NNs, IMO.

**duvenaud**: Author here! What a surprise. This was an abandoned project from 2019, which we never linked or advertised anywhere as far as I know. Anyways, happy to answer questions.

**datastoat**: I like Bayesian inference for few-parameter models where I have solid grounds for choosing my priors. For neural networks, I like to ask people "what's your prior for ReLU versus LeakyReLU versus sigmoid?" and I've never gotten a convincing answer.
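datastoat's question about priors over activations can be made concrete with a prior-predictive check. The sketch below (illustrative only, not from the linked article; the layer width and prior scale are arbitrary choices) samples functions from a one-hidden-layer network whose weights all carry the same Gaussian prior, and shows that the induced prior over *functions* depends heavily on which activation you pick:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_prior_functions(activation, n_draws=5, width=50, sigma=1.0):
    """Draw functions from a 1-hidden-layer NN whose weights have
    independent Gaussian priors, evaluated on a 1-D input grid."""
    xs = np.linspace(-3, 3, 200)
    draws = []
    for _ in range(n_draws):
        w1 = rng.normal(0.0, sigma, size=(1, width))
        b1 = rng.normal(0.0, sigma, size=width)
        # Scale the output layer by 1/sqrt(width) so variance is width-independent.
        w2 = rng.normal(0.0, sigma / np.sqrt(width), size=(width, 1))
        h = activation(xs[:, None] @ w1 + b1)
        draws.append((h @ w2).ravel())
    return xs, np.array(draws)

relu = lambda z: np.maximum(z, 0.0)
leaky_relu = lambda z: np.where(z > 0, z, 0.01 * z)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Same weight-space prior, different function-space priors:
for name, act in [("relu", relu), ("leaky_relu", leaky_relu), ("sigmoid", sigmoid)]:
    _, fs = sample_prior_functions(act)
    print(f"{name:11s} prior-predictive std: {fs.std():.2f}")
```

The point is not the particular numbers, but that a choice usually treated as an architecture detail (the activation) silently changes the prior over functions, which is exactly why weight-space priors for NNs are hard to interpret.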
**sideshowb**: I like Bayes, but I thought the "surprising" result is that double descent is supposed to prevent NNs from overfitting?
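For context on the split conformal prediction approach dccsillag recommends, here is a minimal sketch (toy data and a polynomial regressor standing in for a neural network; all names here are illustrative, not from the article). The key steps are: fit a point predictor on a training split, compute absolute residuals on a held-out calibration split, and take a finite-sample-corrected quantile of those residuals as a symmetric interval half-width:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = sin(x) + noise.
x = rng.uniform(-3, 3, size=600)
y = np.sin(x) + 0.2 * rng.standard_normal(600)

# Three disjoint splits: train the model, calibrate the intervals, evaluate.
x_train, y_train = x[:400], y[:400]
x_cal, y_cal = x[400:500], y[400:500]
x_test, y_test = x[500:], y[500:]

# Any point predictor works; a polynomial fit stands in for a NN here.
coeffs = np.polyfit(x_train, y_train, deg=5)
predict = lambda x_: np.polyval(coeffs, x_)

# Nonconformity scores on the calibration split: absolute residuals.
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile with the finite-sample (n+1) correction.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Intervals [f(x) - q, f(x) + q] have guaranteed marginal coverage
# >= 1 - alpha under exchangeability, regardless of the model or any prior.
lower, upper = predict(x_test) - q, predict(x_test) + q
coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"empirical coverage: {coverage:.2f}")
```

Note the contrast with Bayesian NNs that the comment draws: the coverage guarantee here requires no prior and no assumptions about the model being well-specified, only that calibration and test points are exchangeable.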
-