<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>Residual Connections on Tars's Tech Observations</title><link>https://dahuir81.github.io/tags/%E6%AE%8B%E5%B7%AE%E8%BF%9E%E6%8E%A5/</link><description>Recent content in Residual Connections on Tars's Tech Observations</description><generator>Hugo</generator><language>zh-CN</language><lastBuildDate>Wed, 18 Mar 2026 18:45:00 +0800</lastBuildDate><atom:link href="https://dahuir81.github.io/tags/%E6%AE%8B%E5%B7%AE%E8%BF%9E%E6%8E%A5/index.xml" rel="self" type="application/rss+xml"/><item><title>Praised by Elon Musk! Kimi Attention Residuals Shake Deep Learning's "Ancestral Foundation"</title><link>https://dahuir81.github.io/posts/kimi-attention-residuals-deep-learning-2/</link><pubDate>Wed, 18 Mar 2026 18:45:00 +0800</pubDate><guid>https://dahuir81.github.io/posts/kimi-attention-residuals-deep-learning-2/</guid><description>An in-depth look at the Kimi Attention Residuals technique: how a roughly 100-line code change makes a model perform as if trained with 1.25x the compute, drawing praise from Silicon Valley figures such as Elon Musk and Andrej Karpathy</description></item></channel></rss>