I'm a master's student at DIRO at the Université de Montréal, and I love using math and code to solve complex problems. I am advised by Bang Liu.
Previously I was a speech scientist at Cobalt Speech & Language, a company that designs custom speech recognition, text-to-speech, and dialogue models. While at Cobalt I worked on some neat projects, including language modeling for air traffic control speech recognition and an online training system for ASR models.
I graduated from BYU with a BS in Applied and Computational Mathematics (ACME) with an emphasis in linguistics and a minor in computer science. ACME's rigorous curriculum includes graduate-level courses in algorithms, analysis, optimization, statistics, data science, optimal control, and machine learning.
During my undergrad I interned with Cobalt Speech, as well as Emergent Trading, an automated trading firm that made the news for reporting a problem in a Eurodollar exchange rule that unfairly favored larger competitors. (I developed the analysis tools that were used to track the issue down and determine how our opponent was taking advantage of the rule.)
Around the web I'm known by the username kylrth. I prefer to be contacted through the Matrix protocol (@kyle:kylrth.com). (If you'd like an account on my Matrix server, follow the instructions here.) My GPG public key is here.
Matrix / email / GitHub / LinkedIn / resume
My first foray into ML research came when I received a grant to apply a variable-order CRF model to a morphological parsing task in Basque, achieving 71.3% accuracy. A few months later I joined the computational photonics group at CamachoLab, where I trained DNNs as replacements for expensive FDTD simulations when designing photonic chip components.
In my spare time I'm working on an app that helps language learners develop their vocabulary. Users add words to their active list, and the app recommends additional words they might also want to learn. I hope this app will make it easier for displaced people to adapt to the culture where they find themselves, even when their language skills are intermediate or advanced.
Here are some projects I've worked on:
I felt I had a poor understanding of attention mechanisms, so for a writing assignment I wrote a literature review of the important work being done with neural attention. I covered the various formulations of attention found in the literature and discussed key applications. I learned a lot as I studied this. You can check out my write-up here.
speech2phone was a class project I worked on with two other ACME students during our senior year. We tackled phoneme recognition on the TIMIT corpus by splitting it into separate segmentation and classification tasks. We had fun exploring various algorithms and architectures; we didn't yet have a good sense of the state of the art, so we threw everything at the problem and watched what happened. You can see the repo here and the final report here.
SLURM_gen makes it easy to generate and manage arbitrarily large datasets in a SLURM HPC environment. I used this tool during my undergrad while working on computational photonics research. (I used the BYU supercomputer to generate FDTD simulations, which then served as training data for the neural network model.)