<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0"><channel><title><![CDATA[The Performance Age: Explainers ]]></title><description><![CDATA[Technical writing that demystifies AI, algorithms, systems, and tools—through the lens of clarity, meaning, and design]]></description><link>https://theperformanceage.com/s/explainers</link><image><url>https://substackcdn.com/image/fetch/$s_!4yz1!,w_256,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2Ff728c94f-7e15-41e5-b1c1-bf67e2124381_1179x1179.jpeg</url><title>The Performance Age: Explainers </title><link>https://theperformanceage.com/s/explainers</link></image><generator>Substack</generator><lastBuildDate>Tue, 07 Apr 2026 22:42:22 GMT</lastBuildDate><atom:link href="https://theperformanceage.com/feed" rel="self" type="application/rss+xml"/><copyright><![CDATA[Joshua Hathcock]]></copyright><language><![CDATA[en]]></language><webMaster><![CDATA[theperformanceage@substack.com]]></webMaster><itunes:owner><itunes:email><![CDATA[theperformanceage@substack.com]]></itunes:email><itunes:name><![CDATA[Joshua Hathcock]]></itunes:name></itunes:owner><itunes:author><![CDATA[Joshua Hathcock]]></itunes:author><googleplay:owner><![CDATA[theperformanceage@substack.com]]></googleplay:owner><googleplay:email><![CDATA[theperformanceage@substack.com]]></googleplay:email><googleplay:author><![CDATA[Joshua Hathcock]]></googleplay:author><itunes:block><![CDATA[Yes]]></itunes:block><item><title><![CDATA[Parsing Perception]]></title><description><![CDATA[How Language Models Translate Text into Thought]]></description><link>https://theperformanceage.com/p/how-language-models-see-you</link><guid 
isPermaLink="false">https://theperformanceage.com/p/how-language-models-see-you</guid><dc:creator><![CDATA[Joshua Hathcock]]></dc:creator><pubDate>Tue, 22 Apr 2025 17:10:52 GMT</pubDate><enclosure url="https://substackcdn.com/image/fetch/$s_!QQs7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<div class="captioned-image-container"><figure><a class="image-link image2 is-viewable-img" target="_blank" href="https://substackcdn.com/image/fetch/$s_!QQs7!,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg" data-component-name="Image2ToDOM"><div class="image2-inset"><picture><source type="image/webp" srcset="https://substackcdn.com/image/fetch/$s_!QQs7!,w_424,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg 424w, https://substackcdn.com/image/fetch/$s_!QQs7!,w_848,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg 848w, https://substackcdn.com/image/fetch/$s_!QQs7!,w_1272,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg 1272w, https://substackcdn.com/image/fetch/$s_!QQs7!,w_1456,c_limit,f_webp,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg 1456w" sizes="100vw"><img 
src="https://substackcdn.com/image/fetch/$s_!QQs7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg" width="1456" height="819" data-attrs="{&quot;src&quot;:&quot;https://substack-post-media.s3.amazonaws.com/public/images/3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg&quot;,&quot;srcNoWatermark&quot;:null,&quot;fullscreen&quot;:null,&quot;imageSize&quot;:null,&quot;height&quot;:819,&quot;width&quot;:1456,&quot;resizeWidth&quot;:null,&quot;bytes&quot;:331845,&quot;alt&quot;:null,&quot;title&quot;:null,&quot;type&quot;:&quot;image/jpeg&quot;,&quot;href&quot;:null,&quot;belowTheFold&quot;:false,&quot;topImage&quot;:true,&quot;internalRedirect&quot;:&quot;https://theperformanceage.substack.com/i/161898805?img=https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg&quot;,&quot;isProcessing&quot;:false,&quot;align&quot;:null,&quot;offset&quot;:false}" class="sizing-normal" alt="" srcset="https://substackcdn.com/image/fetch/$s_!QQs7!,w_424,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg 424w, https://substackcdn.com/image/fetch/$s_!QQs7!,w_848,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg 848w, https://substackcdn.com/image/fetch/$s_!QQs7!,w_1272,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg 1272w, 
https://substackcdn.com/image/fetch/$s_!QQs7!,w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fsubstack-post-media.s3.amazonaws.com%2Fpublic%2Fimages%2F3cf166e8-f8f2-4269-98ca-a76e55ddec73_3840x2160.jpeg 1456w" sizes="100vw" fetchpriority="high"></picture></div></a></figure></div><p>&#128073; <strong>Try the interactive tool now:</strong> <a href="https://tokenizer-machine.streamlit.app/">Tokenizer &#8594;</a></p><p>In the age of ChatGPT, Claude, and other AI language systems, we often interact with these tools as if they understand us. We type something in, something intelligent comes out, and we move on. 
But there's a profound gap between how these systems process language and how humans do. This gap reveals much about both artificial intelligence and ourselves.</p><p>Today, I'm sharing Tokenizer, a project that lifts the veil on how language models actually process our words.</p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://theperformanceage.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://theperformanceage.com/subscribe?"><span>Subscribe now</span></a></p><h2>What Happens When AI Reads Your Words?</h2><p>When you type a sentence into the Tokenizer, it traces four key transformations:</p><h3>1. Tokenization: Breaking Language into Fragments</h3><p>Language models don't process whole sentences. Instead, they slice your text into "tokens"&#8212;words, parts of words, or even individual characters. Each token gets mapped to a specific numeric ID from the model's vocabulary.</p><p>For example, the phrase "The young student didn't submit the final report on time" gets broken into tokens like "The", "young", "student", "didn", "'t", "submit"... each with its own ID number. This is the first abstraction away from human language.</p><p>For GPT models, common words might be single tokens, while rare words get split into multiple subword tokens. This affects how the model processes meaning; tokens are the fundamental units of "understanding."</p><h3>2. Part-of-Speech Tagging: Assigning Grammatical Roles</h3><p>Next, the system identifies the grammatical role of each token. Is it a noun, a verb, or an adjective? Is it the subject of the sentence or an object?</p><p>The system maps out what linguists call dependency structure, which shows how words relate to each other in a sentence. 
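</p><p>As a sketch of what that structured representation looks like: the (token, tag, role) triples below are hand-written for the example sentence, purely to show the shape of the output; a real parser such as spaCy predicts these labels statistically rather than looking them up.</p>

```python
# Hand-written analysis of "The young student didn't submit the final
# report on time". A real parser would predict these labels; they are
# spelled out here only to show the structure it produces.
analysis = [
    ("The", "DET", "det"), ("young", "ADJ", "amod"),
    ("student", "NOUN", "nsubj"),   # nsubj = the subject
    ("didn", "AUX", "aux"), ("'t", "PART", "neg"),
    ("submit", "VERB", "ROOT"),     # ROOT = the main verb
    ("the", "DET", "det"), ("final", "ADJ", "amod"),
    ("report", "NOUN", "dobj"),     # dobj = the direct object
    ("on", "ADP", "prep"), ("time", "NOUN", "pobj"),
]

# Pull out the "who did what to whom" skeleton from the dependency roles.
subject = next(tok for tok, tag, role in analysis if role == "nsubj")
verb = next(tok for tok, tag, role in analysis if role == "ROOT")
obj = next(tok for tok, tag, role in analysis if role == "dobj")
print(subject, verb, obj)  # student submit report
```

<p>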
It extracts subjects, verbs, and objects, creating a structured representation of who did what to whom.</p><p>In our example, tools like spaCy would identify "student" as a noun and the subject, "submit" as the main verb, and "report" as the direct object. These relationships form the skeleton of meaning.</p><h3>3. Embedding: Converting Words to Vectors</h3><p>Here's where things get fascinating. Each token gets transformed into a vector, which is a list of hundreds of numbers that capture its meaning and context. In the tool, we use BERT's 768-dimensional embeddings and visualize them in 2D through Principal Component Analysis (PCA).</p><p>Words with similar meanings, contexts, or functions cluster together in this mathematical space. "Dog" and "cat" would be closer to each other than either would be to "algorithm." This "distributional semantics" approach is the foundation of how language models simulate understanding.</p><p>The embedding space is where a model's "knowledge" lives, not as facts, but as geometric relationships between points in this high-dimensional space.</p><h3>4. Dependency Parsing: Mapping Relationships</h3><p>Finally, the system constructs a tree of relationships between words. This visualization shows how modifiers, subjects, objects, and clauses connect to form the complete meaning of your sentence.</p><p>These trees reveal the hierarchical structure of language: which words modify which others, how clauses nest within each other, and how the overall meaning is constructed from individual components.</p><h2>Why This Matters (Beyond the Technical Details)</h2><p>These technical steps reveal something deeper: Language models don't understand language the way humans do. They simulate it convincingly, but fundamentally differently.</p><p>When you or I say "dog", we might recall the feeling of fur, the sound of barking, even emotional responses. 
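</p><p>The clustering described in the embedding step can be sketched with cosine similarity, a standard way to compare two vectors. The four-dimensional vectors below are invented purely for illustration; real embeddings, like BERT's, have hundreds of dimensions.</p>

```python
import math

# Toy "embeddings": invented 4-dimensional vectors standing in for the
# hundreds of dimensions a real model uses.
vectors = {
    "dog":       [0.9, 0.8, 0.1, 0.0],
    "cat":       [0.8, 0.9, 0.2, 0.0],
    "algorithm": [0.0, 0.1, 0.9, 0.8],
}

def cosine(a, b):
    """Cosine similarity: close to 1.0 for vectors pointing the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine(vectors["dog"], vectors["cat"]))        # high: nearby in space
print(cosine(vectors["dog"], vectors["algorithm"]))  # low: far apart
```

<p>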
But when a model sees "dog", it sees a vector of numbers, shaped by how often "dog" appears near words like "bark," "tail," or "vet."</p><p>That's not wrong. It's statistical meaning. But it's also disembodied, ungrounded, and unaware.</p><h2>So What?</h2><p>Language models don't have beliefs or goals; they just predict what's likely to come next.</p><ul><li><p>Their understanding of "truth" is co-occurrence-based, not experiential.</p></li><li><p>Ambiguity that humans resolve instinctively must be explicitly encoded.</p></li></ul><p>And yet: these systems now write our resumes, filter our content, and decide what's visible or valuable. The difference between performance and understanding is no longer philosophical trivia. It's infrastructure.</p><h2>Try It Yourself &amp; The Performance Age</h2><p>Explore how these transformations work in real time using the <a href="https://tokenizer-machine.streamlit.app/">Tokenizer</a>, an interactive visualization tool that reveals how AI parses and embeds your words.</p><p>This is part of a broader exploration I'm calling <em>The Performance Age</em>&#8212;investigating how truth, perception, and performance shift in the algorithmic era.</p><h2>A System's Perspective</h2><p>This tool shows how algorithmic systems convert rich, ambiguous language into structured data. Each transformation is a lossy process. We gain computational tractability but sacrifice nuance. As AI systems increasingly mediate our lives, we must ask:</p><p>What knowledge is amplified? What subtlety is erased?</p><p>How do these algorithmic lenses shape the reality we perceive and the decisions we make?</p><div><hr></div><p><em>Have thoughts about this project? 
Drop them in the comments below, or reach out directly.</em></p><p class="button-wrapper" data-attrs="{&quot;url&quot;:&quot;https://theperformanceage.com/subscribe?&quot;,&quot;text&quot;:&quot;Subscribe now&quot;,&quot;action&quot;:null,&quot;class&quot;:null}" data-component-name="ButtonCreateButton"><a class="button primary" href="https://theperformanceage.com/subscribe?"><span>Subscribe now</span></a></p>]]></content:encoded></item></channel></rss>