
On the Compressive Power of Autoencoders With Linear Threshold Activation Functions and ReLU Activation Functions


Source:
University official website

Indexed:
2025-11-20 13:48:47

Time:
2025-11-27 09:00:00

Venue:
Jinnan Campus

Speaker:
Tatsuya Akutsu

Institution:
-/-

Keywords:
autoencoders, compressive power, linear threshold, ReLU, neural networks, deep learning

Abstract:
This talk discusses the compressive power of autoencoders using linear threshold activation functions and ReLU activation functions, exploring their theoretical properties and potential implications in deep learning models.

Talk description:
Autoencoders are a type of neural network used for unsupervised learning, particularly for dimensionality reduction and feature learning. This lecture will examine the compressive capabilities of autoencoders equipped with two types of activation functions: linear threshold units and rectified linear units (ReLU). The discussion will include a theoretical analysis of how these activation functions influence the representational capacity and compression efficiency of autoencoders, providing insight into their performance in practical applications.
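To make the two activation functions concrete, the following is a minimal illustrative sketch (not material from the talk): a single-layer autoencoder forward pass that compresses a d-dimensional input to a k-dimensional code (k < d), once with ReLU and once with a linear threshold (Heaviside-style) unit. All names, dimensions, and weights here are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Rectified linear unit: max(x, 0), a continuous-valued code
    return np.maximum(x, 0.0)

def linear_threshold(x):
    # Linear threshold unit: fires 1 if the pre-activation exceeds 0,
    # else 0, so the code is binary
    return (x > 0).astype(float)

def autoencode(x, W_enc, W_dec, activation):
    # Encoder compresses the d-dimensional input to a k-dimensional code
    code = activation(W_enc @ x)
    # Decoder maps the code back to d dimensions (the reconstruction)
    recon = W_dec @ code
    return code, recon

d, k = 8, 3  # input dimension and code dimension; k < d gives compression
W_enc = rng.standard_normal((k, d))
W_dec = rng.standard_normal((d, k))
x = rng.standard_normal(d)

code_relu, recon_relu = autoencode(x, W_enc, W_dec, relu)
code_lt, recon_lt = autoencode(x, W_enc, W_dec, linear_threshold)

print(code_relu.shape, code_lt.shape)  # both codes live in 3 dimensions
```

The contrast the sketch exposes is that a ReLU code can carry continuous magnitudes, while a linear-threshold code is restricted to binary vectors, which constrains how much information a k-unit hidden layer can preserve.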
Speaker bio:
Tatsuya Akutsu is a researcher affiliated with the Artificial Intelligence Institute, specializing in theoretical aspects of neural networks and bioinformatics.
