Deep Learning
2025-01-25 Artificial Neural Networks (ANN)
2025-01-25 Recurrent Neural Networks (RNN)
2025-01-25 Attention Mechanisms
2025-01-25 Transfer Learning
2025-01-25 Dropout
2025-01-25 Activation Function
2025-01-25 Embedding Layer
2025-01-25 Multilayer Perceptron (MLP)
2025-01-25 Backpropagation
2025-01-25 Hyperparameters
2025-01-25 Softmax Function
2025-01-25 Long Short-Term Memory (LSTM)
2025-01-25 Vanishing Gradient Problem
2025-01-25 Batch Size
2025-01-25 Convolutional Neural Networks (CNN)
2025-01-25 Deep Learning
2025-01-25 Batch Training
2025-01-25 Stochastic Gradient Descent (SGD)
2025-01-25 Activation Maps
2025-01-25 Capsule Networks (CapsNets)
2025-01-25 Attention Layers
2025-01-25 Skip Connections
2025-01-25 Triplet Loss
2025-01-25 Cross-Entropy Loss
2025-01-25 Sequence Modeling
2025-01-25 Spatial Transformer Networks
2025-01-25 Teacher Forcing
2025-01-25 Feature Extraction
2025-01-25 Quantization
2025-01-25 Self-Attention
2025-01-25 Gradient Descent
2025-01-25 Reinforcement Learning
2025-01-25 Experience Replay
2025-01-25 Curriculum Learning
2025-01-25 Model Pruning
2025-01-25 Loss Function
2025-01-25 Continuous Learning
2025-01-25 Catastrophic Forgetting
2025-01-25 Out-of-Distribution Detection
2025-01-25 Convolution
2025-01-25 Pooling
2025-01-25 Dilated Convolutions
2025-01-25 Think Less, Achieve More: Cutting Inference Cost by 50% Without Sacrificing Accuracy
2025-01-25 100 Deep Learning Terms Explained
2024-07-27 Adversarial Examples
2024-07-26 Tensor
2024-07-03 Transformer Models
2023-11-20 Exploding Gradient Problem