# Course Information
Course: TuTh 15:00-16:20, Frey Hall 326
Instructor: Jeffrey Heinz, jeffrey.heinz@stonybrook.edu
Office Hours: M 13:00-14:00, W 11:00-13:00, SBS N237
# Materials

# Course Log
01 Dec 2022
29 Nov 2022
22 Nov 2022
17 Nov 2022
15 Nov 2022
10 Nov 2022
- Yola presents on Rawski and Heinz’s response to Pater.
- Gillian presents on Dunbar’s response to Pater.
- For Tuesday, please read Nowak et al. 2002 (Logan presenting).
08 Nov 2022
- Rita presents on Pater 2019
- For those interested, here is Kodner and Khalifa 2022, which discusses the past tense debate in the context of the 2022 SIGMORPHON shared task on modeling inflection.
03 Nov 2022
01 Nov 2022
- Jack presented on extracting automata from recurrent neural networks (Weiss et al. 2018).
- For next Tuesday, please read the following (Rita, Yola, and Gillian presenting next Tue and Thu)
- In case you are interested, here is the whole issue
27 Oct 2022
- Han presented on string extension learning (Heinz 2010). Slides are here. (A minimal sketch of the string-extension idea appears after this entry.)
- For those interested, this 2012 paper presents a deeper analysis of the lattice structure of string-extension learnable hypothesis spaces, and shows that many other classes of languages that are identifiable in the limit (including the zero-reversible languages) have this lattice structure. It also characterizes which of these classes are also PAC learnable.
- For Tuesday, please read Weiss et al. 2018 on extracting automata from recurrent neural networks (Jack presenting).
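For those who want to see the mechanics on the page, here is a minimal sketch of the string-extension idea, with the extension function taken to be the k-factor map, so the learned grammar is strictly k-local. The function names are illustrative, not from Heinz 2010.

```python
# Minimal sketch of string extension learning (in the spirit of Heinz 2010).
# A string extension function f maps each string to a finite set of "pieces";
# the learner's grammar is simply the union of f over all observed strings.
# Here f is the k-factor function (with word boundaries), so the learned
# grammar is a strictly k-local (k-SL) phonotactic grammar.

def k_factors(word, k, boundary="#"):
    """Return the set of k-factors of word, padded with boundary symbols."""
    padded = boundary * (k - 1) + word + boundary * (k - 1)
    return {padded[i:i + k] for i in range(len(padded) - k + 1)}

def learn(sample, k):
    """String extension learner: the union of f(w) over the positive sample."""
    grammar = set()
    for word in sample:
        grammar |= k_factors(word, k)
    return grammar

def generates(grammar, word, k):
    """A word is grammatical iff all of its k-factors are in the grammar."""
    return k_factors(word, k) <= grammar

# Usage: learn a 2-SL grammar from a toy sample.
G = learn(["ab", "abab", "ba"], k=2)
print(generates(G, "ababab", 2))  # True
print(generates(G, "aa", 2))      # False: the factor "aa" was never observed
```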
25 Oct 2022
- Nick presented on distributional learning (Clark and Eyraud 2007). The slides are here.
- For Thu Oct 27, read Heinz 2010 on string extension learning (Han presenting).
- For those interested:
- Here is (Clark 2013), which builds on the earlier substitutable-language learning algorithm to remove much (all?) of the ambiguity in the derivations. Consequently, syntactic structures are learned; for example, he basically recovers the syntactic structure underlying propositional logic. (A toy illustration of the substitutability test follows this list.) And here is Clark and Yoshinaka's 2016 review of several extensions to the distributional learning framework since 2007.
- Here are slides from my talk on investigating the generalization powers of NNs on regular languages at the “All Things Language and Computation” series last Friday.
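For concreteness, here is a toy illustration of the substitutability test at the core of Clark and Eyraud's algorithm. This sketches the key notion only (the full algorithm builds a grammar out of the resulting congruence classes), and the helper names are mine.

```python
# Minimal sketch of weak substitutability (Clark and Eyraud 2007).
# Two substrings u and v are (weakly) substitutable given a sample if
# they share at least one context (l, r) such that both l.u.r and l.v.r
# occur in the sample. In substitutable languages, sharing one context
# implies sharing all contexts, which is what licenses merging u and v
# into a single congruence class during learning.

def contexts(substring, sample):
    """All contexts (l, r) in which substring occurs within the sample."""
    ctxs = set()
    for w in sample:
        for i in range(len(w) - len(substring) + 1):
            if w[i:i + len(substring)] == substring:
                ctxs.add((w[:i], w[i + len(substring):]))
    return ctxs

def substitutable(u, v, sample):
    """True iff u and v share at least one context in the sample."""
    return bool(contexts(u, sample) & contexts(v, sample))

# Usage: in this toy sample, "a" and "aa" share the context ("b", "b").
sample = ["bab", "baab"]
print(substitutable("a", "aa", sample))  # True
```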
20 Oct 2022
18 Oct 2022
- We continued our study of identification in the limit paradigms. (Gold's criterion is restated below for reference.)
- Reminder that project proposals are to be approved by November 1. Please talk to me if you need help coming up with a good project.
- Here is the schedule we developed for the next several weeks.
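For reference, the standard statement of Gold's criterion (notation may differ slightly from the course notes):

```latex
% A text for L is an infinite sequence t_1, t_2, \ldots whose content
% (the set of strings it enumerates) is exactly L.
% A learner \varphi identifies \mathcal{L} in the limit from positive
% data iff:
\forall L \in \mathcal{L},\ \forall \text{ texts } t \text{ for } L,\
\exists n,\ \exists G \text{ with } L(G) = L:\
\forall m \ge n,\ \varphi(t_1, \ldots, t_m) = G .
```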
13 Oct 2022
- Eric presented and led discussion on Heinz 2010 “Learning Long-distance Phonotactics”.
- Jeff finished going over the last several slides from his "What does learning mean?" talk. These emphasized that deterministic finite-state machines define a parameterized concept class: whether the concepts are formal (boolean) languages, stochastic languages, or string-to-string functions (that is, whether the parameter values are booleans, reals, or strings), the learning of these classes proceeds in much the same way. Papers exist which make these connections explicit for DFAs in general and for the k-SL and k-SP classes in particular. (A schematic sketch of this parameterization follows this entry.)
- For next Tuesday, review the first 10 pages (up to 4.7) of the notes on identification in the limit. (We have already discussed up to 4.5 on Oct 06).
- Also, those who are enrolled and have not yet presented: please think about which papers you would be interested in presenting over the next few weeks, as I would like to finalize the schedule. Feel free to get in touch if you need pointers.
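To make the parameterization point concrete, here is a schematic sketch (my illustration, not from the slides): a single finite-state skeleton whose behavior is fixed by typed output parameters. Only the boolean instantiation is spelled out.

```python
# Schematic illustration of a deterministic finite-state skeleton whose
# behavior is determined by output parameters. Boolean outputs at states
# give a formal-language acceptor; real outputs (suitably normalized)
# would give a stochastic language; string outputs would give a
# string-to-string function. The learning problem in each case is to
# recover the parameter values from data.

class DFSM:
    def __init__(self, transitions, start, outputs):
        self.transitions = transitions  # (state, symbol) -> next state
        self.start = start
        self.outputs = outputs          # state -> bool (or float, or str)

    def run(self, word):
        """Follow the transitions and return the output at the final state."""
        state = self.start
        for symbol in word:
            state = self.transitions[(state, symbol)]
        return self.outputs[state]

# Boolean parameters: this machine accepts exactly the strings in (ab)*.
acceptor = DFSM(
    transitions={(0, "a"): 1, (1, "b"): 0},
    start=0,
    outputs={0: True, 1: False},
)
print(acceptor.run("abab"))  # True
print(acceptor.run("a"))     # False: stops in a non-accepting state
```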
11 Oct 2022
06 Oct 2022
04 Oct 2022
29 Sep 2022
27 Sep 2022
22 Sep 2022
- Class cancelled. Try to check out the learning talks on Friday at this workshop.
20 Sep 2022
- Magda presented on Angluin and Laird’s 1988 paper “Learning from Noisy Examples.”
15 Sep 2022
- John presents on the VC dimension. (Slides)
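For reference, the standard definitions (following KV94):

```latex
% A finite set S is shattered by a concept class \mathcal{C} iff every
% subset of S is picked out by some concept:
\{\, c \cap S : c \in \mathcal{C} \,\} = 2^{S}.
% The VC dimension of \mathcal{C} is the size of the largest shattered
% set (infinite if arbitrarily large sets are shattered):
\mathrm{VCdim}(\mathcal{C}) = \sup \{\, |S| : S \text{ is shattered by } \mathcal{C} \,\}.
```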
13 Sep 2022
- We went over this handout that reviewed the proof that the tightest-fit rectangle algorithm PAC-learns the class of axis-aligned rectangles and the proof that the elimination algorithm PAC-learns the class of monomials. (A minimal sketch of the tightest-fit learner appears after this entry.)
- For Thursday, read sections 1, 2, and 3 of chapter 3 of KV94 on the VC dimension.
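Here is a minimal sketch of the tightest-fit learner from the handout, for two-dimensional data; the function names are illustrative. When the sample is consistent with an axis-aligned target rectangle, the hypothesis always lies inside the target, which is the fact the PAC analysis exploits.

```python
# Tightest-fit rectangle learner: from labeled points in the plane,
# output the smallest axis-aligned rectangle containing all positive
# examples. Its error region is confined to the target rectangle.

def tightest_fit(sample):
    """sample: list of ((x, y), label) pairs; returns (xmin, xmax, ymin, ymax)."""
    positives = [p for p, label in sample if label]
    if not positives:
        return None  # the empty hypothesis: reject everything
    xs = [x for x, _ in positives]
    ys = [y for _, y in positives]
    return (min(xs), max(xs), min(ys), max(ys))

def classify(rect, point):
    """True iff the point falls inside the hypothesis rectangle."""
    if rect is None:
        return False
    xmin, xmax, ymin, ymax = rect
    x, y = point
    return xmin <= x <= xmax and ymin <= y <= ymax

# Usage on a toy sample drawn from the target rectangle [0,1] x [0,1]:
data = [((0.2, 0.3), True), ((0.9, 0.8), True), ((1.5, 0.5), False)]
h = tightest_fit(data)
print(h, classify(h, (0.5, 0.5)))  # (0.2, 0.9, 0.3, 0.8) True
```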
08 Sep 2022
- We reviewed the preliminary definition of PAC learnability.
- We discussed and explained the PAC learnability of axis-aligned rectangles.
- We discussed the modified definition of PAC learnability which takes into account the size of the representation of the concepts.
- We established a plan for the next several classes.
06 Sep 2022
- We went over HW03.
- We discussed up to 5.7 in Valiant 2013.
- We defined PAC learning formally.
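For reference, the criterion in its usual form (the in-class statement may differ in bookkeeping details):

```latex
% An algorithm A PAC-learns a concept class \mathcal{C} iff for every
% target c \in \mathcal{C}, every distribution D, and every
% \varepsilon, \delta \in (0,1), given m = \mathrm{poly}(1/\varepsilon, 1/\delta)
% labeled examples drawn i.i.d. from D,
\Pr_{S \sim D^{m}}\big[\, \mathrm{err}_{D}(A(S)) \le \varepsilon \,\big] \ge 1 - \delta,
% where \mathrm{err}_{D}(h) = \Pr_{x \sim D}[\, h(x) \neq c(x) \,].
% The refined definition (see the 08 Sep entry) also lets m grow with
% the size of the representation of c.
```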
01 Sep 2022
- We finished the handout on enumerability and computability.
- We discussed the first part of chapter 5 of Valiant 2013 “Probably Approximately Correct”.
- For next time
30 Aug 2022
- We discussed chapter 3 of Valiant 2013.
- We explained enumerability and why there are too few grammars (the counting argument is restated after this entry).
- For next time
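For reference, the counting argument in one line:

```latex
% Grammars are finite strings over a finite vocabulary, hence countable;
% languages are arbitrary subsets of \Sigma^{*}, hence uncountable:
|\{\text{grammars}\}| \le |\Sigma^{*}| = \aleph_{0}
< 2^{\aleph_{0}} = |2^{\Sigma^{*}}| = |\{\text{languages}\}|.
% So all but countably many languages have no grammar at all.
```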
25 Aug 2022
23 Aug 2022