Sitemap

A list of all the posts and pages found on the site. For you robots out there, an XML version is available for digesting as well.

Pages

- Welcome to Taka Yamakoshi's website!
- Archive Layout with Content
- Posts by Category
- Posts by Collection
- Page not in menu
- Talks and presentations
- Terms and Privacy Policy
- Jupyter notebook markdown generator

Posts

- Future Blog Post
- Blog Post number 4
- Blog Post number 3
- Blog Post number 2
- Blog Post number 1

Portfolio

- Portfolio item number 1
- Portfolio item number 2

Projects

- Demo: Probing BERT’s priors with serial reproduction chains
- Demo: WordPiece Explorer
- Demo: Causal interventions expose implicit situation models for commonsense language understanding

Publications

2020

RD Hawkins*, T Yamakoshi*, TL Griffiths, AE Goldberg. Investigating representations of verb bias in neural language models. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
paper code

2022

T Yamakoshi, TL Griffiths, RD Hawkins. Probing BERT’s priors with serial reproduction chains. Findings of the Association for Computational Linguistics (ACL)
paper code demo

2023

T Yamakoshi, JL McClelland, AE Goldberg, RD Hawkins. Causal interventions expose implicit situation models for commonsense language understanding. Findings of the Association for Computational Linguistics (ACL)
paper code demo

2024

S Kumar*, TR Sumers*, T Yamakoshi, A Goldstein, U Hasson, KA Norman, TL Griffiths, RD Hawkins, SA Nastase. Shared functional specialization in transformer-based language models and the human brain. Nature Communications
paper code

Talks

- Talk 1 on Relevant Topic in Your Field
- Tutorial 1 on Relevant Topic in Your Field
- Talk 2 on Relevant Topic in Your Field
- Conference Proceeding talk 3 on Relevant Topic in Your Field

Teaching

- Teaching experience 1
- Teaching experience 2