理工学研究科Graduate School of Science and Engineering
HUI500X3(人間情報学 / Human informatics 500)知的情報処理特論1Intelligent information processing (Ⅰ)
彌冨 仁Hitoshi IYATOMI
授業コードなどClass code etc
学部・研究科Faculty/Graduate school | 理工学研究科Graduate School of Science and Engineering |
添付ファイル名Attached documents | |
年度Year | 2022 |
授業コードClass code | YB016 |
旧授業コードPrevious Class code | |
旧科目名Previous Class title | |
開講時期Term | 春学期授業/Spring |
曜日・時限Day/Period | 金2/Fri.2 |
科目種別Class Type | |
キャンパスCampus | 小金井 (Koganei)
教室名称Classroom name | Check the timetable of each faculty/graduate school
配当年次Grade | |
単位数Credit(s) | 2 |
備考(履修条件等)Notes | |
実務経験のある教員による授業科目Class taught by instructors with practical experience | |
カテゴリーCategory | 応用情報工学専攻 (Applied Informatics)
授業の概要と目的(何を学ぶか)Outline and objectives
This course first introduces "deep learning" techniques and then covers the fundamentals that support them, such as linear algebra, statistics, probabilistic models, and their optimization.
The objective of this course is for students to grasp the important aspects of machine learning techniques and their relationship to state-of-the-art artificial intelligence.
到達目標Goal
Develop fundamental and practical knowledge of machine learning, including an understanding of discriminative (non-parametric) and generative (parametric) models. This enables students to understand state-of-the-art papers in this field.
この授業を履修することで学部等のディプロマポリシーに示されたどの能力を習得することができるか(該当授業科目と学位授与方針に明示された学習成果との関連)Which item of the diploma policy will be obtained by taking this class?
Related to DP1, DP2, and DP3 of the diploma policy.
授業で使用する言語Default language used in class
英語 / English
授業の進め方と方法Method(s)(学期の途中で変更になる場合には、別途提示します。 /If the Method(s) is changed, we will announce the details of any changes. )
Each class usually consists of a lecture, discussion, and exercises.
Students are requested to complete the exercises in each class and several homework assignments.
アクティブラーニング(グループディスカッション、ディベート等)の実施Active learning in class (Group discussion, Debate, etc.)
あり / Yes
フィールドワーク(学外での実習等)の実施Fieldwork in class
なし / No
授業計画Schedule
授業形態/methods of teaching:対面/face to face
※The teaching method of each class session is tentative. Please follow the instructor's instructions.
1[対面/face to face]:Introduction to machine learning
What is machine learning? Definition and history. Classification models and generative models
2[対面/face to face]:Introduction to deep convolutional neural networks
Introduction to back-propagation neural networks (BPNN), convolutional neural networks (CNNs), object detection models, Transformers, BERT, and GAN models.
3[対面/face to face]:Fundamentals of machine learning (1)
Overfitting (overlearning) and how to deal with it – regularizers (L1, L2, elastic net)
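As a supplementary illustration of the regularizers listed for this session, the short NumPy sketch below adds L1, L2, and elastic-net penalty terms to a squared-error loss of a linear model; the function name, penalty weights, and toy data are illustrative assumptions, not course material.

```python
import numpy as np

def regularized_loss(w, X, y, lam=0.1, alpha=0.5):
    """Squared-error loss with L1, L2, and elastic-net penalties.

    w     : weight vector of the linear model
    X, y  : design matrix and targets
    lam   : overall regularization strength (illustrative value)
    alpha : elastic-net mixing ratio (1.0 -> pure L1, 0.0 -> pure L2)
    """
    residual = X @ w - y
    data_term = 0.5 * np.mean(residual ** 2)

    l1 = np.sum(np.abs(w))                   # L1 (lasso) penalty: promotes sparsity
    l2 = 0.5 * np.sum(w ** 2)                # L2 (ridge) penalty: shrinks weights smoothly
    elastic = alpha * l1 + (1 - alpha) * l2  # elastic net: convex mix of the two

    return data_term + lam * elastic

# Toy usage with random data
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.1 * rng.normal(size=50)
w = rng.normal(size=5)
print(regularized_loss(w, X, y))
```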
4[対面/face to face]:Fundamentals of machine learning (2)
Basics of probability theory – covariance, Bayes' probabilities, and parameter estimation by maximum-likelihood (ML) estimation (curve-fitting example)
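The sketch below illustrates the curve-fitting example in a minimal way: under a Gaussian noise assumption, maximum-likelihood estimation of polynomial coefficients reduces to least squares. The polynomial degree and the toy data are assumptions made for illustration.

```python
import numpy as np

# Toy data: noisy samples of sin(2*pi*x), as in the classic curve-fitting example
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
t = np.sin(2 * np.pi * x) + 0.2 * rng.normal(size=x.size)

# Maximum-likelihood fit of a degree-3 polynomial.
# With Gaussian noise, maximizing the likelihood is equivalent to
# minimizing the sum of squared errors, solved here in closed form.
degree = 3
Phi = np.vander(x, degree + 1, increasing=True)   # design matrix [1, x, x^2, x^3]
w_ml, *_ = np.linalg.lstsq(Phi, t, rcond=None)    # least-squares = ML solution

# The ML estimate of the noise variance is the mean squared residual
sigma2_ml = np.mean((Phi @ w_ml - t) ** 2)
print("ML coefficients:", w_ml)
print("ML noise variance:", sigma2_ml)
```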
5[対面/face to face]:Fundamentals of machine learning (3)
Back-propagation (gradient descent), non-linear activation functions, and objective functions (e.g., softmax, cross-entropy)
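A minimal NumPy sketch of the objective functions named above: a softmax output with cross-entropy loss and a single gradient-descent update on a linear classifier; the learning rate and toy data are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(p, y):
    """Mean cross-entropy between predicted probabilities p and one-hot targets y."""
    return -np.mean(np.sum(y * np.log(p + 1e-12), axis=-1))

# Toy data: 4 samples, 3 features, 2 classes
rng = np.random.default_rng(2)
X = rng.normal(size=(4, 3))
Y = np.eye(2)[[0, 1, 1, 0]]          # one-hot targets
W = rng.normal(size=(3, 2))

# One gradient-descent step on a linear softmax classifier.
# For softmax + cross-entropy, the gradient w.r.t. the logits is simply (p - y).
lr = 0.1
p = softmax(X @ W)
grad_W = X.T @ (p - Y) / X.shape[0]
W -= lr * grad_W
print("loss after one step:", cross_entropy(softmax(X @ W), Y))
```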
6[対面/face to face]:Convolutional neural networks (CNNs)
Details of CNNs and deep learning frameworks [with exercises]
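As a rough sketch of what a framework exercise might look like, the example below defines a small convolutional network in PyTorch; the choice of PyTorch, the layer sizes, and the 28x28 grayscale input are assumptions (the course exercises themselves use MATLAB or Google Colaboratory).

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """A minimal CNN: two conv/pool stages followed by a fully connected classifier."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 16x14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),   # -> 32x14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                               # -> 32x7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Forward pass on a dummy batch of 28x28 grayscale images
model = SmallCNN()
logits = model(torch.randn(8, 1, 28, 28))
print(logits.shape)  # torch.Size([8, 10])
```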
7[対面/face to face]:For effective learning (1)
Weight initialization, data pre-processing, covariate shift, batch normalization, regularization (revisited), dropout, and hyper-parameter search
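To make the batch-normalization and dropout items concrete, here is a NumPy sketch of their training-time forward passes; the epsilon, dropout rate, and toy data are illustrative assumptions.

```python
import numpy as np

def batch_norm_train(x, gamma, beta, eps=1e-5):
    """Training-time batch normalization over the batch axis.

    Normalizes each feature to zero mean / unit variance across the mini-batch,
    then rescales with the learnable parameters gamma and beta.
    """
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

def dropout_train(x, rate=0.5, rng=None):
    """Inverted dropout: zero activations with probability `rate`, rescale the rest."""
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

rng = np.random.default_rng(3)
h = rng.normal(size=(32, 8))                    # a batch of 32 hidden vectors
h = batch_norm_train(h, gamma=np.ones(8), beta=np.zeros(8))
h = dropout_train(h, rate=0.5, rng=rng)
print(h.shape)
```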
8[対面/face to face]:For effective learning (2)
Transfer learning and fine-tuning, evaluation criteria
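The evaluation criteria mentioned for this session can be illustrated with a short NumPy computation of precision, recall, and the F1 score from binary predictions; the toy labels are assumptions.

```python
import numpy as np

def precision_recall_f1(y_true, y_pred):
    """Precision, recall, and F1 score for binary labels (1 = positive class)."""
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])
y_pred = np.array([1, 0, 0, 1, 0, 1, 1, 0])
print(precision_recall_f1(y_true, y_pred))  # (0.75, 0.75, 0.75)
```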
9[対面/face to face]:Time-series processing
Recurrent neural networks (RNNs) – LSTM and text processing
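A minimal NumPy sketch of a vanilla recurrent cell unrolled over a short sequence, to make the recurrence concrete (an LSTM adds gating on top of this structure); the dimensions and data are illustrative assumptions.

```python
import numpy as np

def rnn_forward(x_seq, W_xh, W_hh, b_h):
    """Vanilla RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h); returns all hidden states."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x_t in x_seq:
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)

rng = np.random.default_rng(4)
input_dim, hidden_dim, seq_len = 5, 8, 10
x_seq = rng.normal(size=(seq_len, input_dim))
W_xh = rng.normal(size=(hidden_dim, input_dim)) * 0.1
W_hh = rng.normal(size=(hidden_dim, hidden_dim)) * 0.1
b_h = np.zeros(hidden_dim)

print(rnn_forward(x_seq, W_xh, W_hh, b_h).shape)  # (10, 8)
```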
10[対面/face to face]:Low dimensional representation (1)
Singular value decomposition (SVD), eigenvalue decomposition, Principal component analysis (PCA)
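A short NumPy sketch linking the three topics of this session: PCA computed via the SVD of the centered data matrix, with the eigenvalues of the sample covariance recovered from the singular values; the toy data and number of components are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(100, 4)) @ rng.normal(size=(4, 4))   # correlated toy data

# PCA via SVD of the centered data matrix X_c = U S Vt.
# Rows of Vt are the principal directions; S**2 / (n - 1) are the
# eigenvalues of the sample covariance matrix.
X_c = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(X_c, full_matrices=False)
eigvals = S ** 2 / (X.shape[0] - 1)

k = 2
Z = X_c @ Vt[:k].T          # project onto the top-k principal components
print("explained variance ratio:", eigvals[:k] / eigvals.sum())
print("projected shape:", Z.shape)
```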
11[対面/face to face]:Low dimensional representation (2)
Linear discriminant analysis (LDA) and Kernel PCA
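As a sketch of linear discriminant analysis for two classes, the NumPy code below computes the Fisher discriminant direction w proportional to S_W^{-1}(m1 - m0); the toy data are assumptions, and Kernel PCA is not covered by this sketch.

```python
import numpy as np

rng = np.random.default_rng(6)
X0 = rng.normal(loc=0.0, size=(50, 3))          # class 0 samples
X1 = rng.normal(loc=1.5, size=(50, 3))          # class 1 samples

# Fisher LDA for two classes: w is proportional to S_W^{-1} (m1 - m0),
# where S_W is the within-class scatter matrix.
m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
S_W = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
w = np.linalg.solve(S_W, m1 - m0)
w /= np.linalg.norm(w)

# Project both classes onto the discriminant direction and compare their means
print("projected class means:", (X0 @ w).mean(), (X1 @ w).mean())
```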
12[対面/face to face]:Low dimensional representation (3)
Neural network-based dimensionality reduction – autoencoders (AEs), sparse autoencoders, and convolutional autoencoders (CAEs)
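The sketch below defines a minimal fully connected autoencoder in PyTorch to make neural-network-based dimensionality reduction concrete; the framework choice, layer sizes, and bottleneck dimension are assumptions.

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    """Fully connected autoencoder: the bottleneck gives a low-dimensional code."""
    def __init__(self, input_dim=784, code_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, code_dim),                 # low-dimensional representation
        )
        self.decoder = nn.Sequential(
            nn.Linear(code_dim, 128), nn.ReLU(),
            nn.Linear(128, input_dim),
        )

    def forward(self, x):
        code = self.encoder(x)
        return self.decoder(code), code

# Reconstruct a dummy batch; training would minimize the reconstruction error
model = Autoencoder()
x = torch.randn(16, 784)
recon, code = model(x)
loss = nn.functional.mse_loss(recon, x)
print(code.shape, loss.item())
```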
13[対面/face to face]:Important techniques in modern deep networks
Residual blocks, channel-wise (depthwise) and point-wise convolutions, etc.
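A short PyTorch sketch of the building blocks named above: a depthwise (channel-wise) convolution followed by a pointwise (1x1) convolution, wrapped in a residual connection; the channel count and input size are assumptions.

```python
import torch
import torch.nn as nn

class DepthwiseSeparableResBlock(nn.Module):
    """Residual block built from a depthwise (channel-wise) conv followed by a
    pointwise (1x1) conv, with a skip connection added to the output."""
    def __init__(self, channels=32):
        super().__init__()
        self.depthwise = nn.Conv2d(channels, channels, kernel_size=3,
                                   padding=1, groups=channels)       # one filter per channel
        self.pointwise = nn.Conv2d(channels, channels, kernel_size=1)  # mixes channels
        self.bn = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.bn(self.pointwise(self.depthwise(x))))
        return x + out   # residual (skip) connection

block = DepthwiseSeparableResBlock()
y = block(torch.randn(4, 32, 16, 16))
print(y.shape)  # torch.Size([4, 32, 16, 16])
```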
14[対面/face to face]:Wrap-up
wrap-up
授業時間外の学習(準備学習・復習・宿題等)Work to be done outside of class (preparation, etc.)
Preparation and review for this class are each expected to take approximately four hours as a standard. Students should be proficient in basic linear algebra and in programming in at least one language.
テキスト(教科書)Textbooks
No specific textbook is assigned. Materials (papers, references, book chapters, slides) will be provided from time to time.
Students are expected to find good references on their own.
参考書References
"Pattern recognition and machine learning" C.Bishop, Springer 2006.
成績評価の方法と基準Grading criteria
60%: exercises in class and homework assignments.
40%: final report.
(both online and offline)
Use the Hoppii system (learning support system) to submit assignments and to receive feedback.
Explanations of the assignments (except for the final assignment) will be given in the following classes.
学生の意見等からの気づきChanges following student comments
Follow-up in the students' native language is sometimes necessary.
学生が準備すべき機器他Equipment student needs to prepare
Personal computers.
We will basically use MATLAB or Google Colaboratory for the exercises.
その他の重要事項Others
If the course is offered online, any changes to the online class methods, plans, and grading methods will be announced case by case via the learning support system (Hoppii). Please check the learning support system daily to see whether your instructor has contacted you.