
My broad research interests are currently focused on the following topics:
- Spectral algorithms: algorithms that involve singular value decomposition (SVD) or similar factorization techniques
- Representation learning: inducing useful transformations of data and using them to augment supervised learning
- Structured prediction: computation with structured objects (e.g., sequences, trees)
I find these topics most interesting when they are combined. I also value strong empirical performance on real-world problems.
More romantically, I am inherently interested in the prospect that human intelligence (and "heart") can be created computationally.
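To make the first two interests concrete, here is a toy sketch of a spectral approach to representation learning: factorize a word-context co-occurrence matrix with a truncated SVD and read off low-dimensional word embeddings. The count matrix, the square-root transform, and the rank are illustrative assumptions, not taken from any particular paper above.

```python
import numpy as np

# Toy word-context co-occurrence counts (rows: words, columns: contexts).
counts = np.array([
    [4.0, 1.0, 0.0],
    [3.0, 2.0, 1.0],
    [0.0, 1.0, 5.0],
    [1.0, 0.0, 4.0],
])

# A square-root transform tempers heavy-tailed counts before factorization.
transformed = np.sqrt(counts)

# Rank-2 truncated SVD: the left singular vectors, scaled by their
# singular values, serve as low-dimensional word representations.
U, s, Vt = np.linalg.svd(transformed, full_matrices=False)
k = 2
embeddings = U[:, :k] * s[:k]

print(embeddings.shape)  # (4, 2)
```

Words with similar context distributions (rows 0 and 1, rows 2 and 3 in this toy matrix) end up close in the resulting embedding space.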
Publications

Karl Stratos, Michael Collins, and Daniel Hsu.
A New Understanding of Word Embeddings Based on Canonical Correlation Analysis on Transformed Word Counts.
In Proceedings of ACL (2015).

Young-Bum Kim, Karl Stratos, Ruhi Sarikaya, and Minwoo Jeong.
New Transfer Learning Techniques for Disparate Label Sets.
In Proceedings of ACL (2015).

Karl Stratos and Michael Collins.
Simple Semi-Supervised POS Tagging [pdf] [code] [word representations].
In Proceedings of NAACL Workshop on Vector Space Modeling for NLP (2015).

Young-Bum Kim, Minwoo Jeong, Karl Stratos, and Ruhi Sarikaya.
Weakly Supervised Slot Tagging with Partially Labeled Sequences from Web Search Click Logs [pdf].
In Proceedings of NAACL (2015).

Young-Hoon Jung, Karl Stratos, and Luca P. Carloni.
LNAnnote: An Alternative Approach to Information Extraction from Emails using Locally-Customized Named-Entity Recognition [pdf].
In Proceedings of WWW (2015).

Karl Stratos, Do-kyum Kim, Michael Collins, and Daniel Hsu.
A Spectral Algorithm for Learning Class-Based n-gram Models of Natural Language [pdf].
In Proceedings of UAI (2014).
- Here's a longer version that has an appendix on sample complexity.
- Here's the code (cca) I used for deriving word embeddings as described in this paper.
- Here's the code (greedo) for greedy agglomerative clustering of word vectors.
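As a rough illustration of what greedy agglomerative clustering of word vectors looks like (a minimal sketch under centroid linkage, not the actual greedo implementation): start with each vector in its own cluster and repeatedly merge the pair of clusters whose centroids are closest.

```python
import numpy as np

def agglomerate(vectors, num_clusters):
    """Greedy agglomerative clustering: repeatedly merge the pair of
    clusters with the closest centroids (Euclidean distance) until
    only num_clusters remain. Returns a list of index lists."""
    clusters = [[i] for i in range(len(vectors))]
    centroids = [np.asarray(v, dtype=float) for v in vectors]
    while len(clusters) > num_clusters:
        # Scan all pairs for the closest two centroids.
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = np.linalg.norm(centroids[i] - centroids[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        # Merge cluster j into cluster i and recompute the centroid.
        clusters[i].extend(clusters[j])
        centroids[i] = np.mean([vectors[k] for k in clusters[i]], axis=0)
        del clusters[j], centroids[j]
    return clusters

# Two tight groups of toy word vectors.
vecs = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
print(sorted(sorted(c) for c in agglomerate(vecs, 2)))  # [[0, 1], [2, 3]]
```

The quadratic pair scan keeps the sketch short; a practical implementation would maintain a priority queue of pairwise distances instead.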

Shay B. Cohen, Karl Stratos, Michael Collins, Dean P. Foster and Lyle Ungar.
Spectral Learning of Latent-Variable PCFGs: Algorithms and Sample Complexity [pdf].
In JMLR (2014).

Karl Stratos, Alexander M. Rush, Shay B. Cohen, and Michael Collins.
Spectral Learning of Refinement HMMs [pdf][slides].
In Proceedings of CoNLL (2013).

Shay B. Cohen, Karl Stratos, Michael Collins, Dean Foster and Lyle Ungar.
Experiments with Spectral Learning of Latent-Variable PCFGs [pdf].
In Proceedings of NAACL (2013).

Shay B. Cohen, Karl Stratos, Michael Collins, Dean Foster and Lyle Ungar.
Spectral Learning of Latent-Variable PCFGs [pdf][slides].
In Proceedings of ACL (2012).

Margaret Mitchell, Xufeng Han, Jesse Dodge, Alyssa Mensch, Amit Goyal, Alex Berg, Kota Yamaguchi, Tamara Berg, Karl Stratos, and Hal Daumé III.
Midge: Generating Image Descriptions From Computer Vision Detections [pdf].
In Proceedings of EACL (2012).

Alexander C. Berg, Tamara L. Berg, Hal Daumé III, Jesse Dodge, Amit Goyal, Xufeng Han, Alyssa Mensch, Margaret Mitchell, Aneesh Sood, Karl Stratos, and Kota Yamaguchi.
Understanding and Predicting Importance in Images [pdf].
In Proceedings of CVPR (2012).

Karl Stratos, Lenhart K. Schubert, and Jonathan Gordon.
Episodic Logic: Natural Logic + Reasoning [pdf].
In Proceedings of KEOD (2011).

Lenhart Schubert, Jonathan Gordon, Karl Stratos, and Adina Rubinoff.
Towards Adequate Knowledge and Natural Inference [pdf].
In Proceedings of AAAI Fall Symposium on Advances in Cognitive Systems (2011).
Activities
2014 Summer: I had the benefit of interning with Slav Petrov and Emily Pitler at Google New York.
2013 Summer: I had the privilege of interning with Sham Kakade and T. J. Hazen at Microsoft Research New England.
2013 Summer: We gave a tutorial on spectral methods in NLP at NAACL 2013! Video: [Part 1] [Part 2]
2011 Summer: I participated in the summer workshop at JHU CLSP as part of the "vision" team. I worked mainly on analyzing and modeling what people choose to describe in an image.
2010–2011: At URCS, I was part of the KNEXT project, a joint endeavor of Len Schubert, Jonathan Gordon, and Benjamin Van Durme (among others) to extract commonsense knowledge from textual resources.
Informal Writings
These are just for fun.

Projections onto linear subspaces [pdf].

A minimalist's exposition of EM [pdf].

Notes on the framework of Ando and Zhang (2005) [pdf].

Max margin training ("support vector machines") [pdf].

Shift-reduce dependency parsing [pdf].

Approximate CCA [pdf].

A Hitchhiker's Guide to PCA and CCA [pdf].

The Lorentz transformation [pdf].

A formulation of the EM algorithm for PCFGs [pdf].

A formulation of the EM algorithm for HMMs [pdf].

