Graduate School of Science and Engineering
HUI500X3 (Human Informatics 500) Intelligent Information Processing (I)
Hitoshi IYATOMI
Class code etc.
Faculty/Graduate school | Graduate School of Science and Engineering |
Attached documents | |
Year | 2023 |
Class code | YA556 |
Previous class code | |
Previous class title | |
Term | Spring |
Day/Period | Fri. 2 |
Class type | |
Campus | Koganei |
Classroom name | Check the timetable of each faculty/graduate school |
Grade | |
Credit(s) | |
Notes (prerequisites, etc.) | |
Class taught by instructors with practical experience | |
Category | Major in Electrical and Electronics Engineering |
Default language used in class
English
Outline and objectives (what you will learn)
This lecture first introduces state-of-the-art machine learning techniques, and then teaches the fundamental methods that underpin them and are important in today's machine learning, such as CNNs, RNNs, and Transformers.
The necessary background in linear algebra, statistics, and stochastic models and their optimization will also be reviewed.
Goal
To acquire basic and practical knowledge of machine learning; to understand the fundamentals of today's deep learning techniques, such as CNNs and Transformers, and to be able to implement them to a certain degree.
Which item of the diploma policy will be obtained by taking this class?
Related to DP1, DP2, and DP3 of the diploma policy.
Method(s) (If the method is changed during the semester, details will be announced separately.)
Each class usually consists of a lecture, discussion, and exercises.
Students are requested to complete exercises in each class as well as several homework assignments.
Active learning in class (group discussion, debate, etc.)
Yes
Fieldwork in class
No
Schedule
Method of teaching: face to face
Note: The teaching method of each session is tentative; follow the instructor's directions.
1 [face to face]: Introduction to machine learning
What is machine learning? Definition and history. Classification and generative models, with cutting-edge examples.
2 [face to face]: Training of neural networks
"Learning" in backpropagation neural networks (BPNNs) and convolutional neural networks (CNNs).
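As an informal illustration of what "learning" by backpropagation means (a minimal NumPy sketch, not course material; the network size, data, and learning rate are arbitrary choices), a one-hidden-layer network can be trained by hand-derived gradients:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 2))             # toy inputs
y = X[:, :1] * X[:, 1:2]                 # toy target: x1 * x2

W1 = rng.normal(scale=0.5, size=(2, 8))  # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1))  # hidden -> output weights
b2 = np.zeros(1)

lr = 0.1
losses = []
for _ in range(200):
    h = np.tanh(X @ W1 + b1)             # forward: hidden activations
    out = h @ W2 + b2                    # forward: linear output
    err = out - y                        # gradient of 0.5 * squared error w.r.t. out
    losses.append(float((err ** 2).mean()))
    # Backward pass: chain rule through the linear and tanh layers
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)     # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(0)
    # Gradient descent step
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
```

The training loss recorded in `losses` should decrease over the 200 steps, which is all "learning" means here.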
3 [face to face]: Efficient training of neural networks (1)
Weight initialization, data pre-processing, covariate shift, batch normalization, regularization (revisited), dropout, hyper-parameter search
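Of the techniques above, batch normalization is easy to sketch in isolation. The following is a minimal NumPy version of the training-time forward pass only (not course material; running statistics and the backward pass are omitted):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch, then scale and shift
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(1)
# Features with a large offset and scale, as might arise mid-network
x = rng.normal(loc=5.0, scale=3.0, size=(256, 4))
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

After normalization each feature of `out` has (approximately) zero mean and unit variance, regardless of the input's offset and scale.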
4 [face to face]: Efficient training of neural networks (2)
CNNs with efficient training techniques (residual connections, squeeze-and-excitation, noisy student, etc.)
5 [face to face]: Fundamentals of machine learning (1)
Review of basic probability theory for machine learning: covariance, Bayesian probability, and parameter estimation with maximum-likelihood (ML) estimation (curve-fitting example)
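A classic version of the curve-fitting example: under Gaussian noise, the maximum-likelihood fit of a polynomial reduces to least squares. This is a hedged sketch (not course material; the sine target, degree, and noise level are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 50)
# Noisy observations of sin(2*pi*x), Gaussian noise with sigma = 0.1
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.1, size=x.shape)

# Design matrix for a degree-3 polynomial: columns 1, x, x^2, x^3
Phi = np.vander(x, 4, increasing=True)
# ML estimate under Gaussian noise = least-squares solution
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = Phi @ w
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
```

The fitted curve tracks the underlying sine closely, so the residual error stays near the noise level.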
6 [face to face]: Fundamentals of machine learning (2)
Review of basic linear algebra for dimensionality reduction
7 [face to face]: Dimensionality reduction (1)
Singular value decomposition (SVD), eigenvalue decomposition, principal component analysis (PCA)
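PCA can be computed directly from the SVD of the centered data matrix. A minimal NumPy sketch (not course material; the synthetic data and function name are illustrative):

```python
import numpy as np

def pca(X, k):
    # Center the data, then take the top-k right singular vectors
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                   # principal directions (rows)
    scores = Xc @ components.T            # data projected onto them
    explained = S ** 2 / (len(X) - 1)     # eigenvalues of the sample covariance
    return scores, components, explained

rng = np.random.default_rng(3)
# 2-D data stretched far more along one axis than the other
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])
scores, comps, var = pca(X, k=1)
```

Because the singular values come out in descending order, `var` is sorted by explained variance, and the first component captures the stretched direction.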
8 [face to face]: Dimensionality reduction (2)
Neural-network-based dimensionality reduction: autoencoders (AEs), sparse autoencoders, and convolutional autoencoders (CAEs)
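The simplest autoencoder, a linear encoder and decoder trained to reconstruct the input through a narrow bottleneck, can be written with NumPy alone. A hedged sketch (not course material; dimensions, data, and learning rate are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
# 8-D data that actually lies in a 3-D subspace, so a 3-D code suffices
X = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 8))

W_enc = rng.normal(scale=0.1, size=(8, 3))  # encoder: 8 -> 3
W_dec = rng.normal(scale=0.1, size=(3, 8))  # decoder: 3 -> 8

lr = 0.01
errs = []
for _ in range(300):
    Z = X @ W_enc                  # bottleneck code
    X_hat = Z @ W_dec              # reconstruction
    E = X_hat - X
    errs.append(float((E ** 2).mean()))
    # Gradients of the mean squared reconstruction error
    gW_dec = Z.T @ E / len(X)
    gW_enc = X.T @ (E @ W_dec.T) / len(X)
    W_dec -= lr * gW_dec
    W_enc -= lr * gW_enc
```

The reconstruction error in `errs` falls as the encoder learns to keep the 3 directions that matter, which is the sense in which an autoencoder performs dimensionality reduction.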
9 [face to face]: Time-series (text) processing
Recurrent neural networks (RNNs): LSTMs and text processing
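The core of an LSTM is a single recurrence over time steps, with three gates controlling a cell state. A forward-pass-only NumPy sketch for one cell (not course material; weight shapes follow the common convention of stacking the four gates into one affine map, and all values are random):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    # One LSTM time step: the four gate pre-activations are slices
    # of a single affine map of [x, h]
    z = x @ W + h @ U + b
    i, f, o, g = np.split(z, 4, axis=-1)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input / forget / output gates
    g = np.tanh(g)                                # candidate cell state
    c_new = f * c + i * g                         # gated cell-state update
    h_new = o * np.tanh(c_new)                    # hidden state (the output)
    return h_new, c_new

d_in, d_h = 3, 5
rng = np.random.default_rng(5)
W = rng.normal(scale=0.1, size=(d_in, 4 * d_h))
U = rng.normal(scale=0.1, size=(d_h, 4 * d_h))
b = np.zeros(4 * d_h)

h = np.zeros(d_h)
c = np.zeros(d_h)
for t in range(4):                                # unroll over a short sequence
    h, c = lstm_step(rng.normal(size=d_in), h, c, W, U, b)
```

Because the output is `o * tanh(c)` with a sigmoid gate, every entry of `h` is bounded in magnitude below 1, which helps keep the recurrence stable.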
10 [face to face]: Transformers (1): the new foundation of state-of-the-art ML models
Introduction to the attention mechanism and Transformers
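Scaled dot-product attention, the building block of Transformers, is only a few lines of NumPy. A minimal single-head sketch (not course material; the shapes are illustrative and masking/multi-head projection are omitted):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # query-key similarities
    weights = softmax(scores, axis=-1)        # each query's distribution over keys
    return weights @ V, weights               # weighted sum of values

rng = np.random.default_rng(6)
Q = rng.normal(size=(4, 8))   # 4 queries, dimension 8
K = rng.normal(size=(6, 8))   # 6 keys
V = rng.normal(size=(6, 8))   # 6 values
out, w = attention(Q, K, V)
```

Each row of `w` sums to 1: every query output is a convex combination of the value vectors, weighted by how well the query matches each key.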
11 [face to face]: Transformers (2)
Introduction to Bidirectional Encoder Representations from Transformers (BERT)
12 [face to face]: Reinforcement learning (1)
State-value function, the Bellman equation, value iteration, Monte Carlo methods, temporal difference learning
13 [face to face]: Reinforcement learning (2)
Policy evaluation and control, on-policy vs. off-policy learning, SARSA, Q-learning, deep Q-networks (DQNs)
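Tabular Q-learning fits in a short script. A hedged sketch on a made-up corridor MDP (not course material; states 0 to 4, actions left/right, reward 1 only for reaching the terminal state 4, and all hyper-parameters are arbitrary):

```python
import random

random.seed(0)
N_STATES = 5                 # corridor states 0..4; 4 is terminal
ACTIONS = (0, 1)             # 0 = left, 1 = right
GOAL = N_STATES - 1
alpha, gamma, eps = 0.5, 0.9, 0.1
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(s, a):
    # Deterministic transition; reward 1 only on reaching the goal
    s2 = max(0, s - 1) if a == 0 else min(GOAL, s + 1)
    return s2, (1.0 if s2 == GOAL else 0.0)

for _ in range(200):         # episodes
    s = 0
    while s != GOAL:
        # epsilon-greedy action selection, with random tie-breaking
        if random.random() < eps or Q[s][0] == Q[s][1]:
            a = random.choice(ACTIONS)
        else:
            a = 0 if Q[s][0] > Q[s][1] else 1
        s2, r = step(s, a)
        # Q-learning update: off-policy, bootstraps from max_a' Q(s', a')
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2
```

After training, "right" has the higher Q-value next to the goal, and the greedy policy walks straight down the corridor; replacing `max(Q[s2])` with the value of the action actually taken next would turn this into the on-policy SARSA update.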
14 [face to face]: Wrap-up
Course wrap-up and review.
Work to be done outside of class (preparation, review, homework, etc.)
[The standard preparation and review time for this class is four hours each.] Students should be proficient in basic linear algebra and in programming in at least one language.
Textbooks
No specific textbook is assigned. Materials (papers, references, book chapters, slides) will be provided from time to time.
Students are expected to find good references on their own.
References
"Pattern Recognition and Machine Learning", C. Bishop, Springer, 2006.
Grading criteria
60%: in-class exercises and homework.
40%: final report.
(both online and offline)
Use the Hoppii system (learning support system) to submit assignments and receive feedback.
Explanations of the assignments (except for the final assignment) will be given in subsequent classes.
Changes following student comments
Follow-up in students' native language is sometimes necessary.
Equipment students need to prepare
A personal computer.
We will basically use Google Colaboratory for the exercises.