-
| Reading Club Session 23 February 2021 by Reuben Brasher
Everyone is talking about GPT-3, so we talked about GPT-3. This seems to be the paper that started it all, "Language Models are Few-Shot Learners."
The paper is long at 75 pages, but there are 31 authors, so that works out to ~2.42 pages per author. That should make it easy. We will probably only have time to discuss sections 2 and 3 in detail.
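As a concrete illustration of what "few-shot" means in the paper: the task is specified entirely through a handful of examples placed in the prompt, with no gradient updates. The sketch below just builds such a prompt as a string; the task and formatting are illustrative assumptions, not an actual API call or the paper's exact setup.

```python
# A minimal sketch of a few-shot prompt: a couple of worked examples followed
# by the query, all presented as plain text for the language model to continue.
examples = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
]
query = "peppermint"

prompt = "Translate English to French:\n"
for english, french in examples:
    prompt += f"{english} => {french}\n"
prompt += f"{query} =>"

print(prompt)  # this string would be fed to the model as-is; no fine-tuning involved
```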
-
| Reading Club session for 9 February 2021 by Reuben Brasher
We began a new thread on ethics in AI, focusing on papers by Buolamwini and Gebru.
AI is now used routinely to make decisions that were once made by people, in areas ranging from hiring to policing to social matchmaking. It seems fair to scrutinize these applications for ethics and fairness. Particularly sensitive applications involve biometrics and facial recognition. In the US, recent examples of federal bills and proposed legislation include the Algorithmic Accountability Act, the Commercial Facial Recognition Privacy Act of 2019, and the No Biometric Barriers Act. All of these propose, in one way or another, auditing procedures for face processing technologies. There have already been large public audits by scholars and organizations, and these have affected providers of face processing APIs such as Microsoft, IBM, Amazon, and smaller enterprises.
-
| Reading Club session 14 January 2021 by Reuben Brasher
For this first session of 2021, we ran an almost purely tutorial session. We covered BERT again, but this time worked through the experience using TensorFlow 2.3 and TensorFlow Hub on a virtual machine. Along with the tutorial, we created a small repo with requirements and some instructions.
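As a rough illustration of the kind of setup the tutorial walks through, here is a minimal sketch of loading a pretrained BERT encoder from TensorFlow Hub. The specific module handles and versions below are assumptions, not necessarily the ones used in the session or in the repo.

```python
# Minimal sketch: load a BERT preprocessing layer and encoder from TensorFlow Hub
# and embed a sentence. Module handles/versions are assumed, not from the session.
import tensorflow as tf
import tensorflow_hub as hub

preprocess = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
encoder = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4",
    trainable=False)

sentences = tf.constant(["The reading club met on a virtual machine."])
encoder_inputs = preprocess(sentences)    # tokenization and input packing
outputs = encoder(encoder_inputs)         # dict with pooled and per-token outputs

print(outputs["pooled_output"].shape)     # (1, 768) sentence-level embedding
print(outputs["sequence_output"].shape)   # (1, 128, 768) per-token embeddings
```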
-
| Reading Club session 26 January 2021 by Reuben Brasher
Google Research recently open sourced TaPas, a system for answering natural language queries over tabular data. The model is fully differentiable and based on BERT. We read the paper, "TAPAS: Weakly Supervised Table Parsing via Pre-training," ran the code from the Google Research repo on a virtual machine, and saw both the power of the system and some of its limitations.
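The session used the google-research repo directly; as a lighter-weight way to get a feel for the model, the sketch below uses the Hugging Face port of TAPAS instead. The checkpoint name and API details are assumptions about that port, not part of the paper or the session's setup.

```python
# Minimal sketch (assumed Hugging Face TAPAS port): ask a natural language
# question over a small pandas table and map predictions back to cells.
import pandas as pd
from transformers import TapasForQuestionAnswering, TapasTokenizer

model_name = "google/tapas-base-finetuned-wtq"   # assumed checkpoint name
tokenizer = TapasTokenizer.from_pretrained(model_name)
model = TapasForQuestionAnswering.from_pretrained(model_name)

# TAPAS expects every table cell as a string.
table = pd.DataFrame({"Paper": ["TAPAS", "BERT"], "Year": ["2020", "2018"]})
queries = ["Which paper was published in 2020?"]

inputs = tokenizer(table=table, queries=queries,
                   padding="max_length", return_tensors="pt")
outputs = model(**inputs)

# Convert cell-selection and aggregation logits back to table coordinates.
coords, agg_indices = tokenizer.convert_logits_to_predictions(
    inputs, outputs.logits.detach(), outputs.logits_aggregation.detach())
answers = [table.iat[row, col] for row, col in coords[0]]
print(answers, agg_indices)
```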
-
| Reading Club session 12 November 2020 by Reuben Brasher
We looked at a quantum-inspired classical algorithm by Ewin Tang in "A quantum-inspired classical algorithm for recommendation systems" and an associated follow-up paper by Arrazola et alia, "Quantum-inspired algorithms in practice." The algorithm uses clever sampling techniques to approximate solutions to linear systems of the form Ax = b where x is unknown, essentially what the HHL algorithm does. Each approach has its strengths and weaknesses, which the second paper discusses in more detail.
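A core primitive behind these quantum-inspired algorithms is sampling rows of a matrix with probability proportional to their squared norms. The sketch below illustrates only that sampling idea on random data, not the full recommendation-system or linear-system algorithms; the matrix size and sample count are illustrative assumptions.

```python
# Minimal sketch of length-squared (norm-squared) row sampling, the building
# block these quantum-inspired algorithms use to avoid reading all of A.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 50))
k = 20                                           # number of sampled rows (assumed)

# Sample row indices i with probability ||A_i||^2 / ||A||_F^2.
row_norms_sq = np.sum(A**2, axis=1)
probs = row_norms_sq / row_norms_sq.sum()
sampled = rng.choice(A.shape[0], size=k, p=probs)

# Rescaled sampled rows give an unbiased estimate of A^T A.
S = A[sampled] / np.sqrt(k * probs[sampled, None])
approx = S.T @ S
exact = A.T @ A
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))  # relative error
```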
-
| Reading Club session 17 December 2020 by Reuben Brasher
We returned to a topic we had covered in many previous sessions: NLP and word embeddings. We discussed "Deep contextualized word representations" by Peters et alia, along with the historical context in which ELMo became popular: as a bidirectional language model, it takes context and word order into account, an advantage over the static vectors of Word2vec.
For the second half of the session, we went over a tutorial on using pretrained models from TensorFlow Hub, in particular the BERT language model.
-
| Reading Club session 29 October 2020 by Reuben Brasher
We had taken a break from regular meetings, and when we returned, we switched gears to topics in quantum computing applied to machine learning. We discussed a few sections from “Machine learning & artificial intelligence in the quantum domain” by Vedran Dunjko and Hans J. Briegel.
-
| Reading Club session 6 August 2020 by Reuben Brasher
We discussed two classic papers by Alex Graves. The older of the two was "Generating Sequences With Recurrent Neural Networks," where we focused especially on the parts about generating handwriting. The newer one, "DRAW: A Recurrent Neural Network For Image Generation," had several coauthors from Google DeepMind and explained how to combine a recurrent variational auto-encoder with a spatial attention mechanism to generate realistic images.
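As a reminder of the basic idea behind "Generating Sequences With Recurrent Neural Networks," the sketch below shows the autoregressive sampling loop: predict a distribution over the next symbol, sample from it, and feed the sample back in. The tiny untrained model is only there to make the loop runnable and stands in for a trained character-level LSTM; the handwriting model in the paper additionally uses a mixture density output layer, which is not shown here.

```python
# Minimal sketch of autoregressive sampling from a recurrent network.
# The vocabulary size, hidden size, and start token are assumptions.
import numpy as np
import tensorflow as tf

vocab_size, hidden = 32, 64
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 16),
    tf.keras.layers.LSTM(hidden, return_sequences=True),
    tf.keras.layers.Dense(vocab_size),           # logits over the next symbol
])

seq = [0]                                        # assumed start token
for _ in range(20):
    logits = model(np.array([seq]))[0, -1]       # distribution for the next step
    next_id = tf.random.categorical(logits[None, :], 1)[0, 0].numpy()
    seq.append(int(next_id))                     # feed the sample back in
print(seq)
```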
-
| Reading Club session for 2 July 2020 by Reuben Brasher
We continued our study of clustering methods on graphs with "Clustering and Community Detection in Directed Networks: A Survey" by Fragkiskos D. Malliaros and Michalis Vazirgiannis, which describes methods for clustering graphs without discarding directional information.
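One family of approaches covered in the survey keeps directional information by clustering a similarity built from co-citation (A^T A) and bibliographic coupling (A A^T) rather than naively symmetrizing the adjacency matrix. The sketch below illustrates that idea on a random directed graph; the graph, cluster count, and use of scikit-learn's spectral clustering are illustrative assumptions, not the survey's experiments.

```python
# Minimal sketch: direction-aware clustering of a random directed graph via a
# co-citation + bibliographic-coupling similarity matrix.
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
n = 60
A = (rng.random((n, n)) < 0.08).astype(float)    # random directed adjacency
np.fill_diagonal(A, 0)

similarity = A @ A.T + A.T @ A                   # keeps in/out-link structure
labels = SpectralClustering(
    n_clusters=3, affinity="precomputed", random_state=0
).fit_predict(similarity)
print(np.bincount(labels))                       # cluster sizes
```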
-
| Reading Club session for 23 July 2020 by Reuben Brasher
We wrapped up our discussion of clustering on graphs with a couple more sections from "Clustering and Community Detection in Directed Networks: A Survey" by Fragkiskos D. Malliaros and Michalis Vazirgiannis.
For the second half of the session, we switched gears and discussed "Polynomial Regression as an Alternative to Neural Nets" by Xi Cheng et alia, a comparison between polynomial regression and neural networks that seemed both theoretically interesting and practically useful.
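As a rough illustration of the kind of comparison in Cheng et alia, the sketch below fits the same synthetic regression problem with an explicit degree-2 polynomial regression and with a small neural network. The data, polynomial degree, and network size are illustrative assumptions, not the paper's experiments.

```python
# Minimal sketch: polynomial regression vs. a small neural net on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 3))
y = X[:, 0] * X[:, 1] + X[:, 2] ** 2 + 0.1 * rng.standard_normal(500)

# Polynomial regression: explicit degree-2 features plus linear least squares.
poly = PolynomialFeatures(degree=2, include_bias=False)
pr = LinearRegression().fit(poly.fit_transform(X), y)

# Small neural network on the raw features.
nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000,
                  random_state=0).fit(X, y)

print("poly R^2:", pr.score(poly.transform(X), y))
print("nn   R^2:", nn.score(X, y))
```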