<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Gorgan's Lab Reports]]></title><description><![CDATA[Gorgan's Lab Reports]]></description><link>https://blog.gorgan.dev</link><generator>RSS for Node</generator><lastBuildDate>Thu, 09 Apr 2026 12:27:32 GMT</lastBuildDate><atom:link href="https://blog.gorgan.dev/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[They Never See It Coming. They Never Do.]]></title><description><![CDATA[They Never See It Coming. They Never Do.
Every disruptive technology in human history has followed the same arc. The skeptics mock it, the wealthy adopt it first, a killer app unlocks the masses, and ]]></description><link>https://blog.gorgan.dev/they-never-see-it-coming-they-never-do</link><guid isPermaLink="true">https://blog.gorgan.dev/they-never-see-it-coming-they-never-do</guid><category><![CDATA[Gorgan.dev]]></category><category><![CDATA[AI]]></category><category><![CDATA[technology]]></category><dc:creator><![CDATA[The Real Gorgan]]></dc:creator><pubDate>Mon, 23 Mar 2026 00:16:52 GMT</pubDate><enclosure url="https://cdn.hashnode.com/uploads/covers/696790b768dbb59baa48635b/ef44726c-d598-414d-958b-7b2efc78c98e.jpg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h3>They Never See It Coming. They Never Do.</h3>
<p><em>Every disruptive technology in human history has followed the same arc. The skeptics mock it, the wealthy adopt it first, a killer app unlocks the masses, and then -- one day -- you cannot imagine life without it. We have been here before.</em></p>
<p><strong>By Gorgan</strong> | March 22, 2026 | Technology &amp; Society</p>
<hr />
<p>In August of 2002, on a late-summer episode of <em>The Screen Savers</em> on TechTV, hosts Patrick Norton and Martin Sargent sat across from a guest making an argument that most people watching probably shrugged off. The internet, the guest suggested, was not some unprecedented rupture in human civilization. It had happened before. Not the internet specifically -- but this. All of this. The confusion, the skepticism, the slow dawning realization that something was quietly rewiring the world.</p>
<blockquote>
<p>"Within a couple of years since the inception of the internet, who would have imagined that every company would need to include a Web Address in every advertisement going forward?"</p>
</blockquote>
<p>The guest's touchstone was electricity. When electric power first reached homes, the light bulb was its only obvious use: expensive light, and nothing more. Then someone made a fan. Then a washing machine. Then a refrigerator. Then a radio. Then a television. The infrastructure had arrived first, and the ecosystem of purpose grew around it, slowly and then all at once. Within a generation, a home without electricity was not a home -- it was a ruin. The electric company became as essential as the water company, as unremarkable as the road outside your door.</p>
<p>The Screen Savers guest was not making a particularly original observation in academic circles. But hearing it said plainly, on a cable television program in 2002, with banner ads still a novelty and broadband still a luxury -- it had the quiet force of a prophecy. Because sitting here now, looking back, it is difficult to argue with a single word of it.</p>
<hr />
<h2>The Pattern That Will Not Change</h2>
<p>Disruptive technology does not follow a thousand different stories. It follows one story, retold with different costumes. The arc is consistent enough to be almost mechanical: a new capability emerges, it is expensive and inconvenient, only the wealthy or the obsessively curious engage with it, a critical mass of utility is eventually reached, infrastructure expands, a killer application unlocks mass adoption, a generation grows up never knowing life without it, and then a new disruption begins the cycle again.</p>
<p>What changes is not the shape of the story. What changes is the speed. And the speed is accelerating.</p>
<hr />
<h2>A Brief History of the Same Disruption</h2>
<h3>1440s -- The Printing Press</h3>
<p>Johannes Gutenberg's movable type press was, at first, a machine for the Church and the wealthy. Books were luxury objects. Literacy was a privilege. Within a century, the press had shattered the Church's monopoly on information, ignited the Protestant Reformation, enabled the Scientific Revolution, and fundamentally destabilized every power structure that depended on controlling the written word. Those in power called it dangerous -- and they were right, just not in the way they meant.</p>
<p><em>"Why does a peasant need to read?"</em></p>
<hr />
<h3>1760s -- The Steam Engine</h3>
<p>James Watt's improved steam engine was, in its earliest form, a solution to a mining problem -- pumping water out of coal mines. The idea of powering looms, locomotives, and ships was theoretical and distant. The wealthy industrialists who invested early were considered gamblers. Within 80 years, steam had remade geography itself: travel times that once took weeks collapsed to hours, trade networks once constrained by animal power expanded across continents, and entire cities rose from nothing around the railway lines.</p>
<p><em>"You want to replace the horse with a boiler?"</em></p>
<hr />
<h3>1880s -- Electricity</h3>
<p>The Screen Savers guest was drawing on this example for a reason. Electricity arrived with a shrug. A light bulb that cost more to install than a year's worth of candles did not immediately read as civilization-altering. The infrastructure -- the wiring, the generators, the substations -- had to come first. The utility followed. Within two generations, the electrical grid was not a feature of modern life. It was the skeleton of it. Every subsequent technology of the 20th century was, in some sense, electricity's child.</p>
<p><em>"I already have gas lamps. They work fine."</em></p>
<hr />
<h3>1876 -- The Telephone</h3>
<p>Western Union, when offered the patent for Alexander Graham Bell's telephone, famously declined. The company's president, William Orton, dismissed the invention as an "electrical toy" and told investors it was practically worthless. Why would anyone need a machine to speak to someone across a wire when letters and telegrams served perfectly well? The telephone's killer app was not business communication -- it was the human need to hear a voice. By the mid-20th century, a business without a telephone number was not a real business.</p>
<p><em>"What would a man say on the telephone that he cannot put in a letter?"</em></p>
<hr />
<h3>1900s -- The Automobile</h3>
<p>Early automobiles were toys for the wealthy. The roads were not built for them. The fuel infrastructure did not exist. They required mechanical skill to operate and broke down constantly. Cities were designed around horses. Critics pointed out -- correctly, for the moment -- that the horse was more reliable, cheaper to maintain, and did not require a separate fuel supply chain. Henry Ford's assembly line did not just make cars affordable. It revealed that the demand had always existed; it was only access that was missing.</p>
<p><em>"The horse has served us for ten thousand years. This is a fad."</em></p>
<hr />
<h3>1990s -- The Internet</h3>
<p>This is the one the 2002 Screen Savers episode was grappling with in real time. The internet had existed in one form or another since the 1960s. But the World Wide Web, the browser, the email address -- these arrived in public consciousness in the early 1990s as novelties. By 1996, news anchors were explaining what the "@" symbol meant. By 2000, every television advertisement ended with a web address -- a thing that would have been gibberish five years earlier. By 2010, a business without a website was operating with one hand tied behind its back. By 2020, a business without a digital presence was functionally invisible.</p>
<p><em>"The internet is a fad. No one wants to shop without touching the product."</em></p>
<hr />
<h3>2007 -- The Smartphone</h3>
<p>When Steve Jobs unveiled the iPhone, he called it a revolutionary product. He was understating it. The smartphone did not just replace the phone and the camera and the map and the music player. It relocated human attention. The smartphone became the first screen most people look at in the morning and the last screen they look at before sleep. Within fifteen years, entire economies had reorganized around mobile-first behavior. Entire generations had grown up with a supercomputer in their pocket as a baseline assumption of existence.</p>
<p><em>"Why would I want the internet on my phone? I have a computer at home."</em></p>
<hr />
<h3>2022 -- Artificial Intelligence</h3>
<p>ChatGPT reached one million users in five days. It reached one hundred million in two months -- at the time, the fastest adoption of any consumer technology ever recorded. The comparison that most observers reached for was the internet. But the more honest comparison, given everything above, is simply: this is the pattern. The wealthy and the curious are in first. The infrastructure is building. The killer applications are still being discovered. We are currently in the "what do you even do with it" phase -- the light bulb phase, the horse-and-boiler phase. The ecosystem of purpose has not yet grown to match the infrastructure of possibility.</p>
<p><em>"It just makes things up. I don't trust it. My job is safe."</em></p>
<hr />
<h2>The Question Nobody Thinks to Ask</h2>
<p>In August of 2002, who would have predicted that within five years a device would exist that put the entire internet, a camera, a music library, and a telephone in a single glass rectangle that fits in a shirt pocket? Who in 1920 would have predicted that within thirty years, moving pictures with sound would be transmitted wirelessly into a box in every American living room? Who in 1880 would have predicted that electricity would one day carry human voices, and then human images, and then eventually all of human knowledge, across any distance at effectively the speed of light?</p>
<p>The Screen Savers guest made the electricity comparison because it was the most recent prior example of infrastructure that arrived before its purpose was understood. But the honest answer is that every major disruptive technology in history was the electricity comparison for the people living through it. The printing press was the electricity of the 15th century. The steam engine was the electricity of the 18th. The telephone was the electricity of the 19th. The internet was the electricity of the 20th.</p>
<p>Artificial intelligence is the electricity of right now. Not because the comparison is poetic. Because the comparison is structural. The infrastructure exists. The killer app is being assembled in real time by millions of developers and companies simultaneously. The adoption curve is steeper than anything that came before it. And the people who are most confident that their particular corner of the professional world is immune to its effects are, historically speaking, the people who are most likely to be wrong.</p>
<blockquote>
<p>"The people who are most confident their corner of the world is immune are, historically speaking, the people most likely to be wrong."</p>
</blockquote>
<p>This is not a counsel of panic. The printing press did not eliminate writers -- it created an explosion of writing. The automobile did not eliminate travel -- it created an explosion of distance covered. The internet did not eliminate commerce -- it created an explosion of commerce. Disruptive technologies do not typically destroy human activity. They relocate it, reshape it, and massively expand the scale at which it operates.</p>
<p>But the shape of the activity changes. And the people who thrive in the new shape are rarely the ones who spent the transition arguing that the new shape was impossible.</p>
<hr />
<h2>The Web Address at the End of Every Ad</h2>
<p>That observation from the 2002 episode is worth sitting with a little longer: the idea that within a few years of the internet's public emergence, every company on earth would feel compelled to include a web address in every advertisement they ran -- a thing that would have seemed absurd, then optional, then obvious, then mandatory.</p>
<p>Look at job listings today. Look at how many of them list AI fluency as a preferred skill. Look at how many companies have dedicated AI strategy roles. Look at how many products have an AI layer built in or bolted on. We are watching, in approximately real time, the transition from absurd to optional. The transition to obvious is coming. The transition to mandatory will follow.</p>
<p>There will be a version of the web-address observation for AI, and someone twenty years from now will make it, the same way Patrick Norton and Martin Sargent sat in a television studio in August of 2002 and marveled at an arc that, in retrospect, was always going exactly one direction.</p>
<p>They never see it coming. They never do. And then one day, they cannot imagine it having gone any other way.</p>
<hr />
<p><em>The Real Gorgan -- therealgorgan.hashnode.dev</em></p>
]]></content:encoded></item><item><title><![CDATA[The Original Gorgan: A 26-Year-Old Identity and the Floppy Disk That Proves It]]></title><description><![CDATA[Preface
I recently stumbled across someone using the handle "RealGorgan" online.
The irony wasn't lost on me. Someone claiming to be the "real" version of a name I've used for over 26 years. At first,]]></description><link>https://blog.gorgan.dev/the-original-gorgan-a-26-year-old-identity-and-the-floppy-disk-that-proves-it</link><guid isPermaLink="true">https://blog.gorgan.dev/the-original-gorgan-a-26-year-old-identity-and-the-floppy-disk-that-proves-it</guid><category><![CDATA[featured]]></category><dc:creator><![CDATA[The Real Gorgan]]></dc:creator><pubDate>Sat, 21 Feb 2026 22:30:27 GMT</pubDate><enclosure url="https://cloudmate-test.s3.us-east-1.amazonaws.com/uploads/covers/696790b768dbb59baa48635b/afa61469-030a-484d-9507-6cacfff71979.jpg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<hr />
<h2>Preface</h2>
<p>I recently stumbled across someone using the handle "RealGorgan" online.</p>
<p>The irony wasn't lost on me. Someone claiming to be the "real" version of a name I've used for over 26 years. At first, I didn't think much of it. Names get reused. People have similar ideas. The internet is a big place.</p>
<p>But it kept nagging at me. Not out of anger, but curiosity. In a world where anyone can claim anything, where digital identities can be fabricated with a few keystrokes, how do you actually prove you were there first?</p>
<h2>TL;DR</h2>
<ul>
<li><p>Someone started using "RealGorgan" about two years ago</p>
</li>
<li><p>I've been using "Gorgan" since November 11, 1999 in Ultima Online</p>
</li>
<li><p>I still have the original 3.5" floppy disk with screenshots from 1999</p>
</li>
<li><p>I created a forensic disk image that preserves the original FAT12 filesystem timestamps</p>
</li>
<li><p>The evidence is cryptographically verifiable</p>
</li>
<li><p><a href="https://gorgan.dev/user/themes/gorgan/images/gorgan_floppy_1999.img">Download the forensic disk image</a> and verify it yourself</p>
</li>
</ul>
<h2>The Challenge</h2>
<p>When you tell someone you've been using a name since 1999, what evidence can you provide?</p>
<p>Screenshots? Anyone can edit a JPEG.</p>
<p>File timestamps? Those change every time you copy a file.</p>
<p>Old forum posts? Most forums from 1999 don't exist anymore, and those that do have been migrated through multiple database systems, losing metadata along the way.</p>
<p>Web archives? The Wayback Machine is great, but it didn't archive everything, and a personal gaming handle wouldn't have been on public websites anyway.</p>
<p>In the age of deepfakes, AI-generated content, and digital manipulation, proving digital provenance is surprisingly difficult. You need something that can't be faked. Something with cryptographic integrity. Something physical.</p>
<p>But here's the thing: I had something better than any of that.</p>
<p>I remembered I had a floppy disk with something on it that could help!</p>
<h2>The Evidence</h2>
<p>The year was 1999. Dial-up internet. The satisfying screech of a 56k modem connecting. And <strong>Ultima Online</strong>, one of the first massively multiplayer online RPGs to exist.</p>
<p>That year I created a character named Gorgan and spent countless hours in that pixelated fantasy world. I was so proud of my character that I took screenshots and saved them to a 3.5" floppy disk. Because that's what you did in 1999. Cloud storage wasn't a thing. Social media didn't exist. You saved your digital memories to physical media and hoped the disk didn't get corrupted.</p>
<img src="https://gorgan.dev/user/themes/gorgan/images/UO0002.jpg" alt="Gorgan in Ultima Online - Screenshot 1" style="display:block;margin:0 auto" />

<blockquote>
<p><em>My character Gorgan in Ultima Online, November 11, 1999</em></p>
</blockquote>
<p>You might notice the first screenshot has some distortion at the bottom. That's not damage from age or a corrupted file. It was like that in the original screenshot for some reason. Maybe a glitch in the game's screenshot function, maybe something with my ancient video card. Either way, it's been there since 1999, preserved exactly as it was captured.</p>
<img src="https://gorgan.dev/user/themes/gorgan/images/UO0006.jpg" alt="Gorgan in Ultima Online - Screenshot 2" style="display:block;margin:0 auto" />

<blockquote>
<p><em>Another shot from the same day</em></p>
</blockquote>
<p>That floppy disk sat in a drawer for over two decades. I forgot about it. Life moved on. The internet evolved. But the name stuck with me. Gorgan became my handle everywhere, from chatrooms and forums to GitHub and professional profiles.</p>
<p>Years later, while playing on UO Gamers: Hybrid (a free Ultima Online shard), someone impersonated my character. In response, I started adding "The Real" prefix to all my character names. It was a defensive move, a way to distinguish myself from the impostor. Eventually, "The Real Gorgan" became another variation I used interchangeably with just "Gorgan."</p>
<p>The irony isn't lost on me that now, decades later, someone else has adopted "RealGorgan" as their handle. History repeating itself, but this time with the roles reversed.</p>
<p>And when I saw it, I knew exactly where to find my proof.</p>
<h2>The Investigation</h2>
<p>I dug through old storage boxes and found it. A dusty 3.5" floppy disk with "UO Screenshots" scrawled on the label in permanent marker. My handwriting from 1999.</p>
<p>Using a USB floppy drive (yes, they still make those), I didn't just copy the files off the disk. I created a complete <strong>forensic disk image</strong> of the entire floppy, capturing every single byte including the filesystem metadata that contains the original timestamps from when the files were created.</p>
<p>The result: a 1.44 MB disk image file that's a perfect snapshot of that floppy disk as it existed 26 years ago.</p>
<p><a href="https://gorgan.dev/user/themes/gorgan/images/gorgan_floppy_1999.img"><strong>Download the forensic disk image</strong></a> (1.44 MB)</p>
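<p>For anyone preserving their own media, the imaging step is just a raw byte-for-byte copy. Below is a minimal sketch using <code>dd</code>. In the real workflow the input would be the USB floppy device (something like <code>/dev/sdb</code>, an assumption that varies by machine; check <code>lsblk</code>); here a blank 1.44 MB stand-in file is imaged so the commands run as-is.</p>
<pre><code class="language-bash"># Real workflow: sudo dd if=/dev/sdb of=gorgan_floppy_1999.img bs=512 conv=noerror,sync
# Sketch: image a 1.44 MB stand-in file instead, so this runs anywhere.
dd if=/dev/zero of=stand_in_floppy bs=512 count=2880 status=none  # 2880 sectors x 512 bytes = 1,474,560 bytes
dd if=stand_in_floppy of=floppy.img bs=512 conv=noerror,sync status=none
sha256sum floppy.img  # record the fingerprint immediately after imaging
</code></pre>
<p><code>conv=noerror,sync</code> keeps <code>dd</code> reading past errors and pads unreadable sectors, which matters for decades-old magnetic media.</p>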
<h3>What the Disk Reveals</h3>
<p>The floppy uses an old filesystem called FAT12, which stores the exact date and time each file was created. This metadata is embedded in the raw disk structure itself and can't be faked without changing the entire disk image, which would completely alter its cryptographic fingerprint.</p>
<p>When you examine the disk image, you'll find two JPEG files:</p>
<ul>
<li><p><strong>UO0002.JPG</strong>: Created November 11, 1999 at 5:04:08 PM</p>
</li>
<li><p><strong>UO0006.JPG</strong>: Created November 11, 1999 at 5:07:40 PM</p>
</li>
</ul>
<p>These aren't "modified" dates that could be changed by copying files around. These are the original creation timestamps as written by Windows 95 or Windows 98 in 1999, stored in the FAT12 directory structure.</p>
<h3>The Digital Fingerprints</h3>
<p>To prove nothing has been tampered with, here are the SHA-256 hashes (digital fingerprints) of everything:</p>
<p><strong>Disk Image:</strong></p>
<pre><code class="language-plaintext">gorgan_floppy_1999.img
SHA-256: a49f14c99d947a902da7ddd1536d11fe53d010e3cbb84c81c0e356a9dda64331
</code></pre>
<p><strong>Screenshots:</strong></p>
<pre><code class="language-plaintext">UO0002.JPG
SHA-256: fdf4174a9f0cf9bfc0be5c5660644e8bab6ecbe50983d1caf7ebae00a08cea7f

UO0006.JPG
SHA-256: 11ab252c3217ede84e7d48a8b75ac2c3a9cf726ecf4ffd129a30013971855ad5
</code></pre>
<p>If even a single bit were different, these fingerprints would be completely different. The integrity is verifiable.</p>
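<p>Checking a fingerprint takes a single command: <code>sha256sum</code> on Linux, or <code>shasum -a 256</code> on macOS. As a runnable sketch with a sample file:</p>
<pre><code class="language-bash"># Compute a SHA-256 fingerprint, save it, and verify it later.
printf 'hello\n' > sample.txt
sha256sum sample.txt > sample.sha256   # record the fingerprint
sha256sum --check sample.sha256        # prints: sample.txt: OK
</code></pre>
<p>To verify the real download, run <code>sha256sum gorgan_floppy_1999.img</code> and compare the output against the fingerprint published above.</p>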
<h2>The Verdict</h2>
<p>Looking at these screenshots now, I'm hit with a wave of nostalgia. The chunky pixel art. The isometric view. The simple joy of exploring a virtual world when the concept of "virtual worlds" was still new and exciting.</p>
<p>Ultima Online was more than just a game. It was where a generation of us learned about online communities, digital identities, and the strange new frontier of living part of your life on the internet. We were pioneers in a way, figuring out who we wanted to be in digital spaces before those spaces became as real and important as physical ones.</p>
<p>And somewhere in that journey, I became Gorgan. Not because I was trying to build a brand or claim a namespace. Just because I needed a name, and that one felt right.</p>
<p>In a world where digital identity matters more than ever, where your online presence can be as important as your offline one, this floppy disk is more than just nostalgia. It's a time capsule from the early internet, preserved in magnetic oxide on a plastic disk, containing cryptographic evidence that anyone with basic technical knowledge can verify.</p>
<h2>The Record</h2>
<p>I'm not writing this to start a fight with whoever is using "RealGorgan" now. Maybe they didn't know. Maybe they thought the name was available. Maybe they have their own reasons.</p>
<p>But the record should be clear:</p>
<p><strong>The name Gorgan has been mine since November 11, 1999.</strong></p>
<p>Before MySpace. Before Facebook. Before Twitter. Before GitHub. Before "RealGorgan."</p>
<p>Twenty-six years later, it still is.</p>
<hr />
<h2>For the Technically Curious</h2>
<p>If you want to verify the timestamps yourself, you can mount the disk image on Linux, Mac, or Windows. The modification dates are stored at offset 0x2600 in the disk image, in the root directory entries.</p>
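<p>Where does 0x2600 come from? On a standard 1.44 MB FAT12 floppy, the boot sector occupies 1 sector and each of the two file allocation tables occupies 9 sectors, so the root directory starts at sector 19, and 19 x 512 = 9,728 = 0x2600. The raw directory entries can be dumped with <code>xxd</code>. A sketch using a blank stand-in image so it runs as-is; substitute <code>gorgan_floppy_1999.img</code> for the real check:</p>
<pre><code class="language-bash"># Root directory of a 1.44 MB FAT12 disk: 1 boot sector + 2 FATs x 9 sectors = sector 19.
img=stand_in.img   # substitute gorgan_floppy_1999.img for the real check
dd if=/dev/zero of="$img" bs=512 count=2880 status=none  # blank stand-in image
xxd -s $((0x2600)) -l 32 "$img"  # hex dump of the first 32-byte directory entry
</code></pre>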
<h3>Mounting on Linux/Mac</h3>
<pre><code class="language-bash"># Download the image
wget https://gorgan.dev/user/themes/gorgan/images/gorgan_floppy_1999.img

# Mount it (Linux)
sudo mkdir -p /mnt/floppy
sudo mount -o loop,ro gorgan_floppy_1999.img /mnt/floppy
ls -la /mnt/floppy

# Mac
hdiutil attach -readonly gorgan_floppy_1999.img
</code></pre>
<p>You should see the two JPEG files with November 11, 1999 timestamps.</p>
<h3>Mounting on Windows</h3>
<p>Windows doesn't natively support mounting raw floppy disk images, but you can use the free ImDisk Toolkit:</p>
<ol>
<li><p>Download and install <a href="http://www.ltr-data.se/opencode.html/#ImDisk">ImDisk Toolkit</a></p>
</li>
<li><p>Right-click the <code>gorgan_floppy_1999.img</code> file</p>
</li>
<li><p>Select "Mount as ImDisk Virtual Disk"</p>
</li>
<li><p>Choose a drive letter and click OK</p>
</li>
<li><p>Open the mounted drive in File Explorer</p>
</li>
<li><p>Right-click on the JPEG files and select Properties to view the creation dates</p>
</li>
</ol>
<p>The timestamps will show as November 11, 1999.</p>
<h3>About the FAT12 Format</h3>
<p>The disk uses FAT12, the standard filesystem for 1.44 MB floppy disks in the 1990s. The date encoding is a packed 16-bit format: bits 0-4 are the day, bits 5-8 are the month, and bits 9-15 are the year offset from 1980.</p>
<p>For UO0002.JPG, the raw date word is 0x276B, which decodes to November 11, 1999.</p>
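<p>The decoding can be checked with plain shell arithmetic. As a sketch, here is the word 0x276B, which encodes November 11, 1999 under the bit layout described above:</p>
<pre><code class="language-bash"># Decode a 16-bit FAT date word: bits 0-4 day, bits 5-8 month, bits 9-15 years since 1980.
word=$(( 0x276B ))
day=$((   word        & 0x1F ))
month=$(( (word >> 5) & 0x0F ))
year=$((  (word >> 9) + 1980 ))
printf '%04d-%02d-%02d\n' "$year" "$month" "$day"  # prints 1999-11-11
</code></pre>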
<p><a href="https://gorgan.dev/user/themes/gorgan/images/gorgan_floppy_1999.img">Download the disk image</a> and verify it yourself. The evidence is all there.</p>
]]></content:encoded></item><item><title><![CDATA[Beyond Code: Why Open Source is the Ultimate Expression of Freedom]]></title><description><![CDATA[A Debt of Gratitude
Every modern innovation, from the servers that power the web to the AI models reshaping our future, stands on the shoulders of the open-source community. Before I dive into my own work, I want to take a moment to thank the thousan...]]></description><link>https://blog.gorgan.dev/beyond-code-why-open-source-is-the-ultimate-expression-of-freedom</link><guid isPermaLink="true">https://blog.gorgan.dev/beyond-code-why-open-source-is-the-ultimate-expression-of-freedom</guid><category><![CDATA[liberties]]></category><category><![CDATA[mlkjr]]></category><category><![CDATA[civil rights]]></category><category><![CDATA[history]]></category><category><![CDATA[opensource]]></category><category><![CDATA[freedom]]></category><category><![CDATA[freedom of speech]]></category><category><![CDATA[mlk]]></category><category><![CDATA[Martin Luther King]]></category><dc:creator><![CDATA[The Real Gorgan]]></dc:creator><pubDate>Sun, 18 Jan 2026 09:51:00 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/stock/unsplash/4kYkKW8v8rY/upload/00d45864bf1ecdc3a150c3fe199a793d.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-a-debt-of-gratitude">A Debt of Gratitude</h1>
<p>Every modern innovation, from the servers that power the web to the AI models reshaping our future, stands on the shoulders of the open-source community. Before I dive into my own work, I want to take a moment to <strong>thank the thousands of developers who contribute their time and expertise to the public good</strong>.</p>
<p>Building for the public isn't just about writing code; it's about creating a digital public good that ensures technology remains accessible, transparent, and collaborative.</p>
<h2 id="heading-my-open-source-journey">My Open Source Journey</h2>
<p>In my own work at <a target="_blank" href="https://gorgan.dev"><em>Gorgan's Lab</em></a>, I've made it a priority to keep the majority of my projects open source. I believe that tools are most powerful when they are shared and refined by the community. From VS Code extensions that enhance developer productivity, to community gaming servers, to communication tools that preserve digital history, these projects embody the philosophy that knowledge and tools should be freely accessible to all.</p>
<h2 id="heading-the-legal-truth-code-is-speech">The Legal Truth: Code is Speech</h2>
<p>The connection between open source and freedom of speech isn't just a metaphor; <strong>it is a legal reality. In the landmark case <em>Bernstein v. United States</em>, the courts recognized that source code is a form of protected expression under the First Amendment.</strong></p>
<p>Just as a poet uses words to convey an idea or a mathematician uses equations to describe the universe, a developer uses code to communicate logic and intent. When we open-source our work, we are exercising our right to "speak" in the language of the 21st century.</p>
<h2 id="heading-why-we-must-protect-this-freedom">Why We Must Protect This Freedom</h2>
<p>The four essential freedoms of free software (the freedom to run, study, change, and redistribute) are increasingly under threat from restrictive licensing, over-regulation, and centralized control.</p>
<p>Protecting open source is about more than just software; <strong>it is about protecting the freedom of speech itself.</strong> If we lose the right to share our code openly, we lose the right to innovate without a gatekeeper. <strong>We must remain vigilant against policies that seek to treat code as a dangerous "munition" rather than the expressive, creative work that it is.</strong></p>
<h2 id="heading-a-call-to-action-learning-from-history">A Call to Action: Learning from History</h2>
<h3 id="heading-as-we-approach-martin-luther-king-jr-day"><strong>As we approach Martin Luther King Jr. Day,</strong></h3>
<p>We are reminded that progress has never come from passivity. <strong>Dr. King's legacy teaches us that remaining idle in the face of injustice is itself a choice, and history has not been kind to those who stood by while freedoms eroded.</strong></p>
<p><strong>Dr. King wrote from Birmingham Jail</strong>: "Injustice anywhere is a threat to justice everywhere." Today, that injustice extends into the digital realm. When code is restricted, when knowledge is gatekept, when technology is controlled by a concentrated few, we face a new form of tyranny. The oligarchic consolidation of technological power threatens the very principles of open innovation and democratic access to knowledge.</p>
<p><strong>History shows us that transformation requires mobilization. The Civil Rights Movement succeeded not because people waited for change, but because they organized, spoke out, and refused to accept the status quo.</strong> Similarly, the open-source movement must be more than a technical preference; it must be a conscious stand against the centralization of power and knowledge.</p>
<p>This MLK Day, we should ask ourselves: <strong>Are we idle observers, or are we active participants in protecting digital freedom? The tools we build, the code we share, and the communities we foster are acts of resistance against a future where innovation is locked behind proprietary walls and controlled by the powerful few.</strong></p>
<h2 id="heading-conclusion">Conclusion</h2>
<p>Open source is a technical expression of democracy. It is a world where the best idea wins, and where everyone has a seat at the table. But democracy requires vigilance, courage, and action.</p>
<p><strong>Let's honor Dr. King's legacy by refusing to be idle.</strong> Let's keep building, keep sharing, and above all, keep protecting our right to speak in code. The fight for digital freedom is the civil rights movement of our time, and history will judge us by whether we stood up or stood by.</p>
]]></content:encoded></item><item><title><![CDATA[Harnessing AI: Transforming Ideas into Reality]]></title><description><![CDATA[When artificial intelligence (AI) first emerged, it sparked a wave of curiosity—from coding projects and culinary recipes to deep philosophical debates. The introduction of AI-powered tools, like GitHub Copilot in Visual Studio Code (VSCode), has ope...]]></description><link>https://blog.gorgan.dev/harnessing-ai-transforming-ideas-into-reality</link><guid isPermaLink="true">https://blog.gorgan.dev/harnessing-ai-transforming-ideas-into-reality</guid><category><![CDATA[Developer]]></category><category><![CDATA[AI]]></category><category><![CDATA[agentic-coding]]></category><category><![CDATA[vscode extensions]]></category><category><![CDATA[VSCode Tips]]></category><category><![CDATA[claude.ai]]></category><category><![CDATA[chatgpt]]></category><dc:creator><![CDATA[The Real Gorgan]]></dc:creator><pubDate>Fri, 16 Jan 2026 05:05:47 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/stock/unsplash/KrYbarbAx5s/upload/907d3ff8e05af0673aadf56d423a7922.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>When artificial intelligence (AI) first emerged, it sparked a wave of curiosity—from coding projects and culinary recipes to deep philosophical debates. The introduction of AI-powered tools, like <strong>GitHub Copilot</strong> in <strong>Visual Studio Code (VSCode)</strong>, has opened a vast array of new opportunities for developers and enthusiasts alike.</p>
<p>Over the past couple of years, I’ve developed several software projects by leveraging the power of AI. My workflow has evolved significantly:</p>
<ul>
<li><p><strong>The Early Days:</strong> Heavy reliance on copying and pasting between platforms like ChatGPT.</p>
</li>
<li><p><strong>The Shift:</strong> The introduction of Claude made the experience faster and more efficient.</p>
</li>
<li><p><strong>The Modern Stack:</strong> Using MCP Servers and detailed prompt instructions to build complex systems.</p>
</li>
</ul>
<p>This evolution has made my development process more dynamic and innovative. I’m eagerly anticipating future advancements as I prepare to build even more ambitious projects.</p>
<blockquote>
<p><strong>Join the Conversation:</strong> How do you incorporate AI into your daily life? Do you use it professionally? Has it enabled you to achieve tasks that were previously impossible? I’d love to hear your thoughts in the comments!</p>
</blockquote>
<hr />
<h3 id="heading-tips-for-harnessing-ai-in-your-projects">Tips for Harnessing AI in Your Projects</h3>
<p>If you're looking to integrate AI into your own workflow, here are a few best practices to keep in mind:</p>
<ul>
<li><p><strong>Start Small:</strong> Begin with minor projects to understand the capabilities and limitations of your tools.</p>
</li>
<li><p><strong>Stay Updated:</strong> AI technology is rapidly evolving; keep an eye on new features to maximize your efficiency.</p>
</li>
<li><p><strong>Experiment with Tools:</strong> Explore different platforms to find the specific features that suit your unique workflow.</p>
</li>
<li><p><strong>Leverage Community:</strong> Join forums and social groups to gain insights and support from other builders.</p>
</li>
<li><p><strong>Focus on Data Quality:</strong> Ensure your input data is clean and relevant for more accurate outcomes.</p>
</li>
<li><p><strong>Iterate and Improve:</strong> Be prepared to refine your models and processes based on feedback and results.</p>
</li>
<li><p><strong>Consider Ethics:</strong> Be mindful of privacy, bias, and transparency in your AI solutions.</p>
</li>
<li><p><strong>Document Your Process:</strong> Keep records of challenges and solutions for future collaboration.</p>
</li>
</ul>
<p>By incorporating these tips, you can effectively utilize AI to transform your ideas into reality.</p>
]]></content:encoded></item><item><title><![CDATA[The 23-Minute Tax: Why Your Schedule is Draining You]]></title><description><![CDATA[The Trap of the Micro-Schedule: Finding Flow in Code
Do you ever feel like you wish you could clone yourself to get more work done in a day? Why does there never seem to be enough time in the days to get what we want done?
I often wish I could come u...]]></description><link>https://blog.gorgan.dev/too-many-projects-not-enough-time</link><guid isPermaLink="true">https://blog.gorgan.dev/too-many-projects-not-enough-time</guid><category><![CDATA[Developer]]></category><category><![CDATA[Time management]]></category><category><![CDATA[Mental Health]]></category><dc:creator><![CDATA[The Real Gorgan]]></dc:creator><pubDate>Thu, 15 Jan 2026 07:04:30 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/stock/unsplash/Bb_gxpV09qk/upload/ce11b21573858013ad6035ad1c65b6e2.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-the-trap-of-the-micro-schedule-finding-flow-in-code">The Trap of the Micro-Schedule: Finding Flow in Code</h1>
<p>Do you ever feel like you wish you could clone yourself to get more work done in a day? Why does there never seem to be enough time in the day to get what we want done?</p>
<p>I often wish I could come up with a strict schedule—you know, the kind where you work on Project A for two hours, then switch to Project B at exactly 11:00 AM. I try to micro-manage my time, but I find it impossible the second I get "lost in the code." Once I'm deep into a problem, that schedule becomes a cage.</p>
<p>Then, when the timer finally goes off, I don't want to stop. I'm in the zone. I've finally built up the mental map of the entire system, and walking away feels like I'm about to let a sandcastle wash away. Do you ever feel that way?</p>
<hr />
<h2 id="heading-the-hidden-cost-of-switching">The Hidden Cost of Switching</h2>
<p>The reason that "strict schedule" feels so wrong is actually backed by science. Research on workplace interruptions suggests it takes roughly 23 minutes to fully return to a state of deep focus after a single interruption.</p>
<p>When we try to juggle four projects in one day, we aren't just splitting our time; we are paying a massive "cognitive tax" every time we switch. We spend half our energy just "reloading" the context of the next project into our brains. No wonder we feel like we need a clone—we're losing hours of our day just to the act of switching gears.</p>
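<p>To put that tax in numbers, here's a back-of-the-envelope sketch. The 23-minute figure comes from the interruption research above; the project and switch counts below are purely illustrative assumptions, not measurements:</p>

```python
# Back-of-the-envelope estimate of the daily "cognitive tax".
# REFOCUS_MINUTES is the rough figure from interruption research;
# the project/switch counts are illustrative assumptions.

REFOCUS_MINUTES = 23  # approximate time to regain deep focus after a switch

def switching_tax(num_projects: int, switches_per_project: int = 1) -> int:
    """Minutes lost per day purely to regaining focus after switches."""
    return num_projects * switches_per_project * REFOCUS_MINUTES

# Juggling four projects, switching into each just once:
print(switching_tax(4))  # 92 minutes -- over an hour and a half, gone
```

<p>Even under these generous assumptions (one switch per project), you lose a meeting's worth of focus every day before writing a single line of code.</p>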
<hr />
<h2 id="heading-a-different-way-day-theming">A Different Way: "Day Theming"</h2>
<p>Instead of fighting your brain with hourly schedules, have you ever tried <strong>Day Theming</strong>?</p>
<p>Rather than micro-managing your hours, you manage your days. You dedicate all of Monday to Project A and all of Tuesday to Project B.</p>
<p><strong>Why it works:</strong> It respects your "flow state". You don't have to stop just as you're getting started because you have the whole day ahead of you. It eliminates that "start-up cost" and reduces the decision fatigue of wondering what you should be doing next.</p>
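<p>If you like seeing things in code, Day Theming is literally just a weekday-to-project map. The project names here are placeholders, not a real schedule:</p>

```python
from datetime import datetime

# Day Theming in its simplest form: one theme per weekday.
# The project names are placeholders -- swap in your own.
THEMES = {
    "Monday": "Project A",
    "Tuesday": "Project B",
    "Wednesday": "Project A",
    "Thursday": "Writing and documentation",
    "Friday": "Experiments and learning",
}

today = datetime.now().strftime("%A")  # e.g. "Monday"
print(THEMES.get(today, "Unscheduled -- pick one thing and stay lost in it"))
```

<p>The point isn't the code; it's that the whole "schedule" fits in five lines you decide once a week, instead of an hourly grid you fight all day.</p>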
<hr />
<h2 id="heading-the-art-of-the-context-dump">The Art of the "Context Dump"</h2>
<p>But what about those nights when you're deep in the zone and know you should stop, but you're terrified you'll lose your train of thought?</p>
<p>I've started using a "Context Dump" throughout and at the end of the day. Before I stop working on any task or close my IDE, I write a quick, messy note to my future self:</p>
<blockquote>
<p>"I was halfway through the auth refactor. The next step is fixing the JWT expiration logic. Look at the console log on line 42 for the current error."</p>
</blockquote>
<p>It's like leaving a "Save Point" in a video game. It gives me the "permission" to stop working because I know I won't have to spend an hour tomorrow morning figuring out where I left off.</p>
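<p>You could even script the habit. Here's a minimal sketch that appends a timestamped note to a <code>CONTEXT.md</code> file in the project root; the filename and format are my assumptions, so point it at wherever you actually keep your notes:</p>

```python
from datetime import datetime
from pathlib import Path

def context_dump(note: str, path: str = "CONTEXT.md") -> None:
    """Append a timestamped 'save point' note for your future self.

    The CONTEXT.md filename is just a convention assumed here --
    use whatever file your note-taking setup already reads.
    """
    stamp = datetime.now().strftime("%Y-%m-%d %H:%M")
    with Path(path).open("a", encoding="utf-8") as f:
        f.write(f"\n## {stamp}\n{note}\n")

context_dump(
    "Halfway through the auth refactor. Next: fix the JWT expiration "
    "logic. Check the console log for the current error."
)
```

<p>Bind it to a shell alias or a task runner and dumping context becomes a five-second ritual instead of a chore you skip when you're tired.</p>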
<hr />
<h2 id="heading-protecting-the-lab">Protecting the "Lab"</h2>
<p>At the end of the day, we have to remember that we aren't machines. Working non-stop because we're "in the zone" is a great feeling, but doing it every night is the fastest route to burnout.</p>
<p>The goal isn't just to do more work—it's to do better work without losing ourselves in the process. We need to find ways to honor that "lost in the code" feeling while still making sure we actually have a life outside of the screen.</p>
<p>So, next time you feel like you're drowning in projects... maybe don't reach for a stricter schedule. Try giving yourself more space to stay lost in one thing at a time.</p>
<hr />
<h2 id="heading-tools-to-support-your-flow-vscode-extensions">Tools to Support Your Flow: VSCode Extensions</h2>
<p>To make Day Theming and Context Dumps even easier, I've found two amazing VSCode extensions that keep everything you need right in your editor.</p>
<h3 id="heading-better-sidebar-markdown-notes">🗂️ Better Sidebar Markdown Notes</h3>
<p><strong>Enhanced markdown notes directly in your sidebar</strong></p>
<p>Instead of switching to another app to jot down your context dumps, Better Sidebar Markdown Notes lets you keep your notes right there in VSCode. Create multiple pages, auto-save as you type, and even sync with the cloud for access across devices.</p>
<p><strong>Key Features:</strong></p>
<ul>
<li><p>Multiple pages with easy navigation</p>
</li>
<li><p>GitHub Flavored Markdown support</p>
</li>
<li><p>Auto-save functionality with configurable intervals</p>
</li>
<li><p>Cloud sync support for seamless access anywhere</p>
</li>
<li><p>Advanced backup and restore capabilities</p>
</li>
<li><p>Custom storage locations (workspace or custom path)</p>
</li>
</ul>
<p>This is perfect for those context dumps—quick, messy notes that keep your thoughts from scattering when you need to step away from the keyboard.</p>
<p><a target="_blank" href="https://marketplace.visualstudio.com/items?itemName=TheRealGorgan.better-sidebar-markdown-notes">Install Better Sidebar Markdown Notes →</a></p>
<hr />
<h3 id="heading-google-tasks-for-vscode-with-calendar">✅ Google Tasks for VSCode (with Calendar)</h3>
<p><strong>Manage your Google Tasks directly from VSCode without leaving your editor</strong></p>
<p>Day Theming requires planning, and Google Tasks for VSCode brings your task management seamlessly into your development environment. No context switching, no alt-tabbing to another app—just your tasks and your code side by side.</p>
<p><strong>Key Features:</strong></p>
<ul>
<li><p>View your entire Google Tasks list in the sidebar tree view</p>
</li>
<li><p>Create, edit, and delete tasks without leaving VSCode</p>
</li>
<li><p>Calendar integration to see due dates and events</p>
</li>
<li><p>Real-time sync with your Google Tasks account</p>
</li>
<li><p>Secure OAuth 2.0 authentication</p>
</li>
<li><p>Full CRUD operations for task management</p>
</li>
</ul>
<p>Whether you're organizing your Day Theming schedule or tracking what needs to happen when, this extension keeps your tasks visible and accessible. You can even see which tasks have deadlines coming up—essential when you're dedicating entire days to specific projects.</p>
<p><a target="_blank" href="https://marketplace.visualstudio.com/items?itemName=TheRealGorgan.vscode-google-tasks-extension">Install Google Tasks for VSCode →</a></p>
<hr />
<h2 id="heading-the-complete-flow">The Complete Flow</h2>
<p>Here's how they work together:</p>
<ol>
<li><p><strong>Plan your week with Google Tasks</strong> — Use the extension to organize which project gets which day</p>
</li>
<li><p><strong>Dive deep into your day</strong> — Let yourself get lost in the code, knowing interruptions are minimized</p>
</li>
<li><p><strong>Document your progress</strong> — Use Markdown Notes for quick context dumps before you leave</p>
</li>
<li><p><strong>Pick up seamlessly tomorrow</strong> — Your notes are waiting, your tasks are organized, and you're ready to dive back in</p>
</li>
</ol>
<p>The beauty of these tools is that they live right in your editor—no tab switching, no mental friction. Just you, your code, and the tools that help you stay focused.</p>
<hr />
<p>What do you think? Does the idea of Day Theming feel like it would give you more freedom, or does it sound just as scary as a strict schedule? And have you found tools that help you protect your flow state?</p>
]]></content:encoded></item></channel></rss>