
Practical Deep Learning for Coders
Klein&Macmillan
Articles in this column:
Testimonials
Get started: start watching lesson 1 now! The Economist: “This month fast.ai, an education non-profit based in San Francisco, kicked off the third year of its course in deep learning. Since its inception it has attracted more than 100,000 students, scattered ar… (2024-06-06)
Kaggle
Kaggle is the world’s largest data science community. One of Kaggle’s features is “Notebooks”, which is “a cloud computational environment that enables reproducible and collaborative analysis”. In particular, Kaggle provides access to GPUs for free. Every… (2024-06-03)
Forums
If you need help, there’s a wonderful online community ready to help you at forums.fast.ai. Before asking a question on the forums, search carefully to see if your question has been answered before. (The forum system won’t let you post until you’ve spent a… (2024-06-03)
The book
…section. (2024-05-31)
25. Latent diffusion
Video. (2024-05-31)
24. Attention & transformers
rearrange. (2024-05-29)
23. Super-resolution
In this lesson, we work with Tiny Imagenet to create a super-resolution U-Net model, discussing dataset creation, preprocessing, and data augmentation. The goal of super-resolution is to scale up a low-resolution image to a higher resolution. We train the… (2024-05-29)
22. Karras et al. (2022)
Jeremy begins this lesson with a discussion of improvements to the DDPM/DDIM implementation. He explores the removal of the concept of an integral number of steps, making the process more continuous. He then delves into predicting the amount of noise in an… (2024-05-28)
21. DDIM
In this lesson, Jeremy, Johno, and Tanishq discuss their experiments with the Fashion-MNIST dataset and the CIFAR-10 dataset, a popular dataset for image classification and generative modeling. They introduce Weights and Biases (W&B), an experiment tracking… (2024-05-28)
20. Mixed precision
In this lesson, we dive into mixed precision training and experiment with various techniques. We introduce the MixedPrecision callback for PyTorch and explore the Accelerate library from HuggingFace for speeding up training loops. We also learn a sneaky tr… (2024-05-27)
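Why mixed precision needs care: gradients smaller than float16's tiniest subnormal (~6e-8) underflow to zero, which is what loss scaling works around. A minimal NumPy illustration of the idea, not the MixedPrecision callback or Accelerate themselves; the gradient value and scale factor below are made up:

```python
import numpy as np

# Scaling the loss (and hence the gradients) up before the fp16 cast,
# then dividing back down in fp32, preserves values that would
# otherwise underflow. (Illustrative values, not from the lesson.)
grad = np.float32(1e-8)                  # a typical tiny gradient
lost = np.float16(grad)                  # underflows to 0.0 in half precision

scale = np.float32(1024.0)               # illustrative loss-scale factor
scaled = np.float16(grad * scale)        # 1.024e-5 is representable in fp16
recovered = np.float32(scaled) / scale   # back in fp32, ~1e-8 again
```

The same mechanism (with dynamic scale adjustment and fp32 master weights) is what mixed-precision trainers automate.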
19. DDPM and Dropout
In this lesson, Jeremy introduces Dropout, a technique for improving model performance, and with special guests Tanishq and Johno he discusses Denoising Diffusion Probabilistic Models (DDPM), the underlying foundational approach for diffusion models. The l… (2024-05-27)
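The Dropout technique mentioned above can be written from scratch in a few lines. This is a sketch of the standard "inverted dropout" formulation, not miniai's code; `p` and the array shape are arbitrary:

```python
import numpy as np

# Inverted dropout: at training time each activation is zeroed with
# probability p and the survivors are scaled by 1/(1-p), so the
# expected activation is unchanged and inference needs no adjustment.
def dropout(x, p=0.5, training=True, rng=None):
    if not training or p == 0:
        return x                        # identity at inference time
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p     # keep each unit with prob 1-p
    return x * mask / (1 - p)           # rescale survivors

acts = np.ones((1000, 100))
out = dropout(acts, p=0.2, rng=np.random.default_rng(0))
# roughly 20% of entries are zeroed, yet the mean stays near 1.0
```

The rescaling is the point: without it, the network would see systematically smaller activations at train time than at test time.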
18. Accelerated SGD & ResNets
In this lesson, we dive into various accelerated stochastic gradient descent (SGD) approaches, such as momentum, RMSProp, and Adam. We start by experimenting with these techniques in Microsoft Excel, creating a simple linear regression problem and applying… (2024-05-24)
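The momentum variant described above can be sketched in plain NumPy on a toy least-squares problem: the update steps along an exponentially weighted average of past gradients rather than the raw gradient. The learning rate, momentum coefficient, and fake data below are illustrative choices, not the lesson's Excel values:

```python
import numpy as np

# Fit y = w*x by SGD with momentum. (Illustrative sketch.)
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x                      # true weight is 3

w, v = 0.0, 0.0                  # parameter and velocity
lr, beta = 0.1, 0.9              # learning rate, momentum coefficient
for _ in range(200):
    grad = np.mean(2 * (w * x - y) * x)   # d(MSE)/dw
    v = beta * v + (1 - beta) * grad      # exponentially averaged gradient
    w -= lr * v                           # momentum step
# w converges to ~3.0
```

RMSProp and Adam follow the same pattern but additionally keep a running average of squared gradients to normalize the step size per parameter.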
17. Initialization/normalization
In this lesson, we discuss the importance of weight initialization in neural networks and explore various techniques to improve training. We start by introducing changes to the miniai library and demonstrate the use of HooksCallback and ActivationStats for… (2024-05-24)
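Why initialization matters can be shown in a few lines: Kaiming/He initialization draws weights from N(0, sqrt(2/fan_in)) so activations keep roughly constant scale through a stack of ReLU layers, while a too-small scale makes them collapse. A from-scratch sketch of the idea the lesson motivates (layer sizes, depth, and the "bad" scale are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, n_layers, w_std):
    # Run x through n_layers of (linear @ w, then ReLU).
    for _ in range(n_layers):
        w = rng.normal(0.0, w_std, size=(x.shape[1], x.shape[1]))
        x = np.maximum(x @ w, 0)
    return x

x0 = rng.normal(size=(1000, 512))
good = forward(x0, 10, np.sqrt(2 / 512))  # He scaling: activations stay O(1)
bad = forward(x0, 10, np.sqrt(1 / 512))   # too small: scale halves each layer
```

After ten layers, `bad` has shrunk by roughly 2^10 in second moment while `good` keeps its scale, which is exactly what activation statistics tools like ActivationStats visualize.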
16. The Learner framework
In Lesson 16, we dive into building a flexible training framework called the Learner. We start with a basic callbacks Learner, which is an intermediate step towards the flexible learner. We introduce callbacks, which are functions or classes called at spec… (2024-05-17)
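The callback mechanism the summary describes can be sketched in a few lines: the training loop calls out to named hook methods at fixed points, and a callback overrides only the hooks it cares about. Class and method names here are illustrative, not miniai's exact API:

```python
# A minimal callback-driven training loop skeleton.
class Callback:
    def before_fit(self, learn): pass
    def before_batch(self, learn): pass
    def after_batch(self, learn): pass
    def after_fit(self, learn): pass

class CountBatches(Callback):
    def before_fit(self, learn): self.n = 0
    def after_batch(self, learn): self.n += 1

class Learner:
    def __init__(self, batches, cbs):
        self.batches, self.cbs = batches, cbs
    def _run(self, hook):
        for cb in self.cbs:               # fire each callback's hook
            getattr(cb, hook)(self)
    def fit(self):
        self._run('before_fit')
        for self.batch in self.batches:
            self._run('before_batch')
            # ...forward pass, loss, backward, optimizer step go here...
            self._run('after_batch')
        self._run('after_fit')

counter = CountBatches()
Learner(batches=range(5), cbs=[counter]).fit()
print(counter.n)  # 5
```

Metrics, progress bars, and mixed precision can then all be bolted on as callbacks without touching the loop itself.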
15. Autoencoders
We start with a dive into convolutional autoencoders and explore the concept of convolutions. Convolutions help neural networks understand the structure of a problem, making it easier to solve. We learn how to apply a convolution to an image using a kernel… (2024-05-17)
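Applying a kernel to an image, as described above, amounts to sliding the kernel over every position and summing an elementwise product. A from-scratch sketch of that operation (stride 1, no padding), not fastai's implementation; the kernel and image are illustrative:

```python
import numpy as np

def conv2d(img, kernel):
    # Valid convolution (really cross-correlation, as in deep learning).
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # window times kernel, summed to one output pixel
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

edge_kernel = np.array([[-1., -1., -1.],   # responds where a dark region
                        [ 0.,  0.,  0.],   # sits above a bright one
                        [ 1.,  1.,  1.]])
img = np.zeros((6, 6)); img[3:, :] = 1.0   # dark top half, bright bottom
print(conv2d(img, edge_kernel))
```

The output is large only on the rows straddling the dark-to-bright boundary, which is why such kernels are called edge detectors.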
14. Backpropagation
[Code] (2024-05-16)
13. Backpropagation & MLP
[Code] (2024-05-16)
12. Mean shift clustering
In this lesson, we start by discussing the CLIP Interrogator, a Hugging Face Spaces Gradio app that generates text prompts for creating CLIP embeddings. We then dive back into matrix multiplication, using Einstein summation notation and torch.einsum to sim… (2024-05-15)
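Einstein summation notation, mentioned above, makes the sum inside matrix multiplication explicit: `'ik,kj->ij'` multiplies along the shared index `k` and sums it away. The lesson uses `torch.einsum`; NumPy's `np.einsum` takes the same notation string, shown here so the example runs without a GPU:

```python
import numpy as np

a = np.arange(6.0).reshape(2, 3)
b = np.arange(12.0).reshape(3, 4)

# 'ik,kj->ij': for each (i, j), sum a[i, k] * b[k, j] over k.
c = np.einsum('ik,kj->ij', a, b)
assert np.allclose(c, a @ b)     # identical to plain matrix multiplication
print(c.shape)  # (2, 4)
```

The same notation generalizes beyond matmul (batched products, traces, transposes), which is why it keeps reappearing in attention implementations.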
11. Matrix multiplication
In this lesson, we discuss various techniques and experiments shared by students on the forum, such as interpolating between prompts for visually appealing transitions, improving the update process in text-to-image generation, and a novel approach to de… (2024-05-15)
10. Diving Deeper
[Code] (2024-05-14)
9. Stable Diffusion
score. (2024-05-14)
Summaries: Lesson 8
Building embeddings from scratch. (2024-05-13)
Summaries: Lesson 7
Gradient accumulation and GPU memory. (2024-05-13)
Summaries: Lesson 6
[Code] (2024-05-11)
Summaries: Lesson 5
Linear model and neural net from scratch. (2024-05-11)
Summaries: Lesson 4
New and exciting content: Why Hugging Face Transformers? Will we fine-tune a pretrained NLP model with HF rather than the fastai library in this lecture? Why use Transformers rather than the fastai library? Is Jeremy in the process of integrating Transformers int… (2024-05-10)
Summaries: Lesson 3
Introduction and survey. “Lesson 0” and how to fast.ai: Where is the Lesson 0 video? What does it have to do with the book Meta Learning and the fastai course? How to do a fastai lesson: watch and take notes, run the notebook and experiment, reproduce the notes from t… (2024-05-10)
Summaries: Lesson 2
Daniel 深度碎片. (2024-05-09)
Summaries: Lesson 1
Daniel 深度碎片. (2024-05-09)
Bonus: Data ethics
Video. (2024-05-08)
8. Convolutions (CNN)
embeddings. Video. Resources. (2024-05-08)
7. Collaborative filtering
Video. Resources. (2024-05-07)
6. Random forests
Video. (2024-05-07)
From-scratch model
sigmoid, metrics. Video. (2024-05-06)
Natural language (NLP)
Video. Resources. (2024-05-06)
Neural net foundations
(ReLU). Video. (2024-04-30)
Deployment
Video. (2024-04-30)
Practical Deep Learning
New! Welcome! Lessons… (2024-04-29)
Getting started
Video. Resources. (2024-04-29)