# TP4
**Repository Path**: only_zy/TP4
## Basic Information
- **Project Name**: TP4
- **Description**: Official code repository for https://arxiv.org/abs/2011.14522 (mirror only)
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 0
- **Created**: 2021-12-16
- **Last Updated**: 2021-12-16
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
Empirical Experiments in "Feature Learning in Infinite-width Neural Networks"
========
This repo contains the code to replicate the experiments (Word2Vec and MAML) from our paper
[**Feature Learning in Infinite-Width Neural Networks**](https://arxiv.org/abs/2011.14522)
*Greg Yang, Edward Hu*
In short, the code here will allow you to train feature learning infinite-width neural networks on Word2Vec and on Omniglot (via MAML).
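The paper's central point is that the right parametrization lets an infinite-width network still learn features. A minimal numpy sketch of the idea (illustrative only, not this repo's training code): in the mean-field/μP-style scaling, a two-layer net outputs f(x) = (1/n)·v·relu(u·x), and with the learning rate scaled up by n each weight moves by O(1) per step, so the hidden features evolve even as the width n grows. The toy single-example setup below is an assumption for illustration.

```python
import numpy as np

def train_mean_field(n=2048, lr=0.3, steps=50, seed=0):
    """Toy 2-layer net in a mean-field / muP-style scaling:
        f(x) = (1/n) * v . relu(u * x)
    With the learning rate scaled by n, each weight moves O(1),
    so the network learns features even as n -> infinity.
    (Hypothetical sketch, not the repo's actual training code.)"""
    rng = np.random.default_rng(seed)
    x, y = 1.0, 0.5                      # one scalar training example (assumed)
    u = rng.standard_normal(n)           # hidden weights, O(1) entries
    v = rng.standard_normal(n)           # output weights, O(1) entries
    for _ in range(steps):
        h = np.maximum(u * x, 0.0)       # hidden features
        f = v @ h / n                    # 1/n output scaling
        g = f - y                        # squared-loss gradient w.r.t. f
        # per-weight updates after multiplying the raw lr by n -> O(1) moves:
        v -= lr * g * h
        u -= lr * g * v * x * (h > 0)
    return v @ np.maximum(u * x, 0.0) / n
```

Because the per-weight updates stay O(1), the hidden representation h changes during training; under the standard (NTK) parametrization those same updates would shrink to zero as n grows and the features would stay frozen.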
*(Figures: our results on Word2Vec and on MAML.)*
Please see the README in individual folders for more details.
This is the 4th paper in the Tensor Programs series ([[0]](http://arxiv.org/abs/1902.04760)[[1]](http://arxiv.org/abs/1910.12478)[[2]](http://arxiv.org/abs/2006.14548)[[3]](http://arxiv.org/abs/2009.10685)). Also see here for code in previous papers for calculating the [GP](https://github.com/thegregyang/GP4A) and [NTK](https://github.com/thegregyang/NTK4A) limits of wide neural networks.