WPI Worcester Polytechnic Institute

Computer Science Department
------------------------------------------

DS504/CS586 - Big Data Analytics - Spring 2024

------------------------------------------

Tentative Schedule:

Slides will be updated on Canvas before each lecture.

1. Week 1 (1/16 T):

    Topic 0: Overview of Big Data Analytics
    Readings: N/A

2. Week 2 (1/23 T):

3. Week 3 (1/30 T):

    Topic 3: Big Data Management.
    Reading 1: Section 4.1 in [ACM TIST] Trajectory Data Mining: An Overview. (paper)
    Reading 2: [ACM CIKM 2016] Sampling Big Trajectory Data. (paper) (A toy sampling sketch follows below.)
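
    Sketch (illustrative, not taken from the readings): reservoir sampling is one standard way to keep a uniform fixed-size sample of a trajectory stream too large to hold in memory; the stream and sample size below are made-up placeholders.

        import random

        def reservoir_sample(stream, k):
            """Keep a uniform random sample of k items from a stream of unknown length."""
            sample = []
            for i, item in enumerate(stream):
                if i < k:
                    sample.append(item)
                else:
                    # Replace an existing element with probability k / (i + 1).
                    j = random.randint(0, i)
                    if j < k:
                        sample[j] = item
            return sample

        # Toy usage: sample 5 "trajectory points" from a stream of 10,000.
        points = ((t, t * 0.1, t * 0.2) for t in range(10_000))  # (time, lat, lon) placeholders
        print(reservoir_sample(points, 5))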

4. Week 4 (2/6 T):
    Topic 4: Big Graph Data Mining (Sampling Large-Scale Networks via Random Walk).
    Readings: M. Gjoka, M. Kurant, C. T. Butts, A. Markopoulou, Walking in Facebook: A Case Study of Unbiased Sampling of OSNs, INFOCOM 2010. (paper) (A toy random-walk sketch follows below.)
    Readings: L. Lovász, Random Walks on Graphs: A Survey (Sections 0 and 1), Combinatorics, Volume 2, 1993. (paper)
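
    Sketch (illustrative, in the spirit of the Gjoka et al. reading but not its exact procedure): a Metropolis-Hastings random walk over a toy adjacency list, which corrects the degree bias of a simple random walk; the graph here is made up.

        import random

        def mh_random_walk(adj, start, steps):
            """Metropolis-Hastings random walk: a proposed neighbor is accepted with
            probability min(1, deg(current) / deg(candidate)), so in the long run
            nodes are visited roughly uniformly rather than in proportion to degree."""
            samples, current = [], start
            for _ in range(steps):
                candidate = random.choice(adj[current])
                if random.random() <= len(adj[current]) / len(adj[candidate]):
                    current = candidate
                samples.append(current)
            return samples

        # Toy undirected graph as an adjacency list (placeholder data).
        adj = {
            "a": ["b", "c", "d"],
            "b": ["a", "c"],
            "c": ["a", "b", "d"],
            "d": ["a", "c"],
        }
        print(mh_random_walk(adj, start="a", steps=10))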

5. Week 5 (2/13 T):
    Topic 5: Deep Learning. Why deep? Deep vs. shallow neural networks.
    Note: Individual Project #1 Due: Online Sampling and Estimation.
    Note: Individual Project #2 Starts: Classification with Deep Neural Network Model. (A minimal classifier sketch follows below.)
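
    Sketch (illustrative only; the layer sizes and random data are placeholders, not the Project #2 specification): a minimal fully connected classifier and one training step in PyTorch.

        import torch
        import torch.nn as nn

        # Small multi-layer classifier; sizes are placeholders, not the project spec.
        model = nn.Sequential(
            nn.Linear(784, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, 10),
        )
        loss_fn = nn.CrossEntropyLoss()
        optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

        # One training step on a random mini-batch (stand-in for real data).
        x = torch.randn(32, 784)
        y = torch.randint(0, 10, (32,))
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        print(float(loss))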

6. Week 6 (2/20 T):
    Topic 5: Recipe for Deep Neural Network Training. (A small learning-rate-schedule sketch follows below.)
    Readings: "Deep Learning Tuning Playbook", Varun Godbole†, George E. Dahl†, Justin Gilmer†, Christopher J. Shallue‡, Zachary Nado† († Google Research, Brain Team; ‡ Harvard University), Jan 2023.

7. Week 7 (2/27 T):
    Topic 6: Generative Adversarial Networks (GANs).
    Readings: A Beginner's Guide to Generative Adversarial Networks (GANs). (Link)
    Readings: Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Bengio, Y. (2014). Generative adversarial nets. In Advances in Neural Information Processing Systems (pp. 2672-2680). (paper) (A minimal GAN training-loop sketch follows below.)
    Note: Team Project Starts.
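
    Sketch (illustrative only; the networks and "real" data are toy placeholders, not the Project #3 setup): an alternating GAN training loop in the style of the Goodfellow et al. paper, using the non-saturating generator loss, in PyTorch.

        import torch
        import torch.nn as nn

        G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))   # generator
        D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))   # discriminator
        bce = nn.BCEWithLogitsLoss()
        opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
        opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

        for step in range(100):
            real = torch.randn(64, 2) * 0.5 + 3.0        # stand-in "real" samples
            fake = G(torch.randn(64, 8))                 # generated samples

            # Discriminator step: push real toward 1, fake toward 0.
            opt_d.zero_grad()
            d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
            d_loss.backward()
            opt_d.step()

            # Generator step: make the discriminator label fakes as real.
            opt_g.zero_grad()
            g_loss = bce(D(fake), torch.ones(64, 1))
            g_loss.backward()
            opt_g.step()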

8. Week 8 (3/5 T): Spring Break, No Class. (See this link)

    Note: Individual Project #2 Due: Classification with Deep Neural Network Model.
    Note: Individual Project #3 Starts: GAN.

9. Week 9 (3/12 T):
    Topic 6: More on Generative Adversarial Networks (GANs). (Continued; see the slides for Week 7.)
    Note: Team Project Proposal Due. Submit a 2-page proposal to the Canvas discussion board.

10. Week 10 (3/19 T):
    Topic 8: Meta Learning and Few Shot Learning.
    Readings: Meta Learning tutorial. (Link)
    Readings: Chelsea Finn, Pieter Abbeel, Sergey Levine, Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks. (PDF)
    Readings: Alex Nichol, Joshua Achiam, and John Schulman, On First-Order Meta-Learning Algorithms. (PDF)
    Readings: [Siamese Networks] Chopra, S.; Hadsell, R.; LeCun, Y. (June 2005). "Learning a similarity metric discriminatively, with application to face verification". 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05). (PDF)
    Readings: [Prototypical Networks] Jake Snell, Kevin Swersky, Richard S. Zemel, Prototypical Networks for Few-shot Learning. (PDF) (A toy prototype-classification sketch follows below.)
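
    Sketch (illustrative; the random 16-d "embeddings" stand in for a real encoder's outputs): the prototypical-network classification rule from the Snell et al. reading, where each class prototype is the mean of its support embeddings and queries take the label of the nearest prototype.

        import torch

        def prototypical_predict(support_emb, support_labels, query_emb, n_classes):
            """Classify queries by Euclidean distance to per-class mean embeddings."""
            protos = torch.stack([support_emb[support_labels == c].mean(dim=0)
                                  for c in range(n_classes)])
            dists = torch.cdist(query_emb, protos)  # query-to-prototype distances
            return dists.argmin(dim=1)

        # Toy 3-way, 5-shot episode with placeholder embeddings (no real encoder).
        support = torch.randn(15, 16)
        labels = torch.arange(3).repeat_interleave(5)
        queries = torch.randn(6, 16)
        print(prototypical_predict(support, labels, queries, n_classes=3))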

11. Week 11 (3/26 T): No Class. (See this link)

12. Week 12 (4/2 T):
    Topic 9: Adversarial Attacks/Defense.
    Readings: [FGSM] Ian J. Goodfellow, Jonathon Shlens, and Christian Szegedy, Explaining and Harnessing Adversarial Examples, ICLR 2015. (PDF) (A toy FGSM sketch follows below.)
    Readings: Mahmood Sharif, Sruti Bhagavatula, Lujo Bauer, Michael K. Reiter, Accessorize to a Crime: Real and Stealthy Attacks on State-of-the-Art Face Recognition, CCS 2016. (PDF)
    Note: Individual Project #3 Due: GAN.
    Note: Individual Project #4 Starts: Meta-Learning and Few Shot Learning.
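
    Sketch (illustrative; the linear "model" and random inputs are placeholders, not a real image classifier): the FGSM perturbation from the Goodfellow et al. reading, which moves the input by epsilon in the direction of the sign of the loss gradient with respect to the input.

        import torch
        import torch.nn as nn

        def fgsm_example(model, x, y, epsilon):
            """Return x + epsilon * sign(grad_x loss(model(x), y))."""
            x_adv = x.clone().detach().requires_grad_(True)
            loss = nn.CrossEntropyLoss()(model(x_adv), y)
            loss.backward()
            return (x_adv + epsilon * x_adv.grad.sign()).detach()

        # Toy model and data (placeholders only).
        model = nn.Sequential(nn.Linear(20, 10))
        x = torch.randn(4, 20)
        y = torch.randint(0, 10, (4,))
        x_adv = fgsm_example(model, x, y, epsilon=0.1)
        print(float((x_adv - x).abs().max()))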

13. Week 13 (4/9 T):
    Topic 10: Explainable AI (XAI).
    Readings: XAI Tutorial at the KDD 2019 conference. (Link)
    Readings: Understanding Neural Networks Through Deep Visualization. (PDF) (A simple gradient-saliency sketch follows below.)
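
    Sketch (illustrative; this is plain gradient saliency, a simpler relative of the visualization methods in the readings, and the small model and input are placeholders): attribute a class score to input features via the gradient magnitude.

        import torch
        import torch.nn as nn

        def saliency(model, x, target_class):
            """Gradient of the target-class score w.r.t. the input, used as a saliency map."""
            x = x.clone().detach().requires_grad_(True)
            score = model(x)[0, target_class]
            score.backward()
            return x.grad.abs().squeeze(0)

        # Toy classifier and input (stand-ins for a trained network and real data).
        model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 3))
        x = torch.randn(1, 16)
        print(saliency(model, x, target_class=2))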

14. Week 14 (4/16 T):
    Topic 11: GPT: Generative Pre-trained Transformers and ChatGPT.
    Readings: GPT: Improving Language Understanding by Generative Pre-Training. (Link)
    Readings: GPT-2: Language Models are Unsupervised Multitask Learners. (Link)
    Readings: GPT-3: Language Models are Few-Shot Learners. (Link)
    Readings: GPT-3.5. (Link)
    Readings: GPT-4. (Link)
    Note: Team Project Progress Report Due. Submit a 5-page progress report to the Canvas discussion board.

15. Week 15 (4/23 T):
    Topic 11: Deep Neural Network Compression and Class Review.
    Readings: Jonathan Frankle, Michael Carbin, The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks. (PDF)
    Readings: Geoffrey Hinton, Oriol Vinyals, Jeff Dean, Distilling the Knowledge in a Neural Network. (PDF) (A minimal distillation-loss sketch follows below.)
    Note: Individual Project #4 Due: Meta-Learning and Few Shot Learning.
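
    Sketch (illustrative; the logits and labels are random placeholders): the Hinton et al. distillation loss, blending cross-entropy on hard labels with a temperature-softened term matching the teacher's output distribution.

        import torch
        import torch.nn.functional as F

        def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
            """alpha * hard cross-entropy + (1 - alpha) * T^2 * KL(soft teacher || soft student)."""
            hard = F.cross_entropy(student_logits, labels)
            soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                            F.softmax(teacher_logits / T, dim=1),
                            reduction="batchmean") * (T * T)
            return alpha * hard + (1 - alpha) * soft

        # Toy batch of 4 examples over 10 classes (placeholder values).
        student = torch.randn(4, 10)
        teacher = torch.randn(4, 10)
        labels = torch.randint(0, 10, (4,))
        print(float(distillation_loss(student, teacher, labels)))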

16. Week 16 (4/30 T):
    Team Project Presentations
    All teams.
    Note: Team Project Due. Submit your team report and individual self-and-peer evaluation form on Canvas.

To be updated.



yli15 at wpi.edu