<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/">
    <channel>
        <title>Victor Frye Blog</title>
        <link>https://victorfrye.com/blog</link>
        <description>The personal blog of your friendly neighborhood developer, Victor Frye.</description>
        <lastBuildDate>Sat, 21 Feb 2026 23:28:33 GMT</lastBuildDate>
        <docs>https://validator.w3.org/feed/docs/rss2.html</docs>
        <generator>https://github.com/jpmonette/feed</generator>
        <language>en-US</language>
        <ttl>60</ttl>
        <image>
            <title>Victor Frye Blog</title>
            <url>https://victorfrye.com/assets/profile.png</url>
            <link>https://victorfrye.com/blog</link>
        </image>
        <copyright>© Victor Frye 2026</copyright>
        <item>
            <title><![CDATA[The Copilot Experiment]]></title>
            <link>https://victorfrye.com/blog/posts/copilot-experiment</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/copilot-experiment</guid>
            <pubDate>Wed, 04 Feb 2026 05:00:00 GMT</pubDate>
            <description><![CDATA[A blog post exploring my two-week experiment using GitHub Copilot CLI for all coding tasks as a generative AI skeptic.]]></description>
<content:encoded><![CDATA[<p>I am a generative AI skeptic. I have been enthusiastic about AI since my college days studying deep learning and linear regression models. However, the hype around generative AI has made me cautious, blending optimism with skepticism. I first started using GitHub Copilot in chat interfaces and IDE plugins to understand JavaScript as a front-end novice. Agentic tooling has advanced rapidly since then, and many developers, myself included, haven't fully embraced it yet.</p>
<p>Two weeks ago, I decided to start an experiment: use newer tooling like Copilot CLI, with a higher request limit, for all my coding tasks for two weeks. This includes planning, coding, debugging, testing, and documentation. The goal was to see whether I could level up my productivity in a meaningful way. This post will cover my experiment, findings, and future usage plans.</p>
<h2 id="setting-up-copilot-cli"><a aria-hidden="true" tabindex="-1" href="#setting-up-copilot-cli"><span class="icon icon-link"></span></a>Setting up Copilot CLI</h2>
<p>To start, I installed the <a href="https://github.com/github/copilot-cli">Copilot CLI</a> on my development machine. I chose GitHub Copilot because I already had a subscription and approved usage for related projects. Claude Code and OpenAI Codex are also great options I wish to explore, and I have enough familiarity with them to say they wouldn't significantly alter the results of this experiment. The commands, available features, and limitations are similar enough across these tools.</p>
<p>After installation, the first step was to run Copilot. This is the command-line tool: it isn't your IDE with a chat interface, it's a terminal-based user interface (TUI). Launching Copilot is now a command I have memorized:</p>
<pre><code class="language-bash">copilot
</code></pre>
<p>I fixate on how we launch Copilot because this TUI is where I spent most of my development time during this experiment. I primarily write .NET and TypeScript code, but I don't launch Visual Studio. Even VS Code is closed most of my day. To learn this habit and fully give agentic AI a chance, I committed hard.</p>
<p><img src="/assets/_blog/copilot-experiment/copilot-tui.png" alt="Copilot CLI TUI"></p>
<h2 id="a-new-workflow"><a aria-hidden="true" tabindex="-1" href="#a-new-workflow"><span class="icon icon-link"></span></a>A new workflow</h2>
<p>Starting with Copilot CLI also meant rethinking my inner loop. Normally, my process is as follows:</p>
<ol>
<li>Open work item in backlog (Azure Boards, GitHub Issues, etc.)</li>
<li>Review requirements, acceptance criteria, and make a mental plan</li>
<li>Open IDE and explore existing code (Visual Studio, VS Code, etc.)</li>
<li>Set up local environment (if applicable)</li>
<li>Author code changes</li>
<li>Validate changes (build, test, debug, etc.)</li>
<li>Repeat writing and validating steps until done</li>
<li>Write documentation (if applicable)</li>
<li>Commit and push changes</li>
<li>Self-review and create pull request for team review</li>
</ol>
<p>This was my inner loop for most coding tasks. I didn't know exactly what the new process would look like, but I knew Copilot CLI would be central. With a first task selected, I repeated steps 1 and 2 as usual. Then, instead of opening my IDE, I switched to running <code>copilot</code> in my terminal.</p>
<p>At this point, everything was new, but I had some advice: "Start with a plan." Copilot and similar tools work best with a lot of context; the more context you provide, the better the results. Therefore, I started by writing a plan in the Copilot TUI. This is done by typing <code>/plan</code> and then describing what I wanted to accomplish. For example:</p>
<pre><code class="language-bash">/plan

This codebase has ABC services in the solution. We are working in .NET 10. I want to add the ability to do XYZ.
The requirements are 1, 2, and 3. The acceptance criteria are A, B, and C. We will need a new endpoint to handle this and are using Minimal APIs.
I also want to ensure we have unit tests using xUnit v3 with Microsoft.Testing.Platform and Moq for mocking dependencies.
Finally, I want to update the README to document this new feature.

Ask me any clarifying questions for more context.
</code></pre>
<p>Obviously I can't share the exact details of a prompt for proprietary code, but you can get the general idea. Some callouts though: (1) we spend a lot of time describing the codebase, including the tech stack and specific libraries to use, and (2) we include in the plan prompt that we want Copilot to ask us questions. The first I quickly learned was unnecessary, kinda: context files can provide much of this information for us systematically. The second, asking the agent to ask questions, is vital. These first-stage questions have become a key part of my workflow. No code gets written by me or Copilot until we have gone back and forth a few times clarifying requirements and constraints. It helps me catch missed requirements and edge cases, or simply clarify how I want the code structured.</p>
<p>At the end of the question phase, we have a solid plan. Literally. Pressing <code>Ctrl+Y</code> in the TUI pulls up a markdown file with summaries, todos, code segments, and more. During this phase, Copilot is reading and understanding the codebase and generating a documented plan that both of us can agree on and refer back to during the rest of the development loop.</p>
<blockquote>
<p>Always start with a plan.</p>
</blockquote>
<p>Note again: at this point, even in my first session, I haven't written any code. We've authorized Copilot a few times to read files, but no changes have been made. After this, we switch out of plan mode with <code>Shift+Tab</code>, a shortcut I quickly memorized for toggling between modes. Now, we say the magic words:</p>
<pre><code>Do it
</code></pre>
<p>I think the formal words are "start implementation" or something, but I've come to love typing "Do it" and imagining that Darth Sidious meme as I'm commanding my botchild.</p>
<p>And it goes. Copilot starts requesting authorizations to write and edit files, to run <code>dotnet build</code> and <code>dotnet test</code>, and more. It writes code, runs tests, debugs issues, and even writes documentation. All while I sit back. On rare occasions, I send a steering prompt when I notice it spinning in circles or going off track. However, I also quickly learned that letting Copilot reach a conclusion and then replanning is less disruptive. It's not perfect, but I've found that focusing on the plan stage and spending more time there yields better results. I have also grown bored of approving permissions, so I've changed my launch command:</p>
<pre><code class="language-bash">copilot --yolo
</code></pre>
<p>I adore that <code>--yolo</code> flag because YOLO, but also because it enables all permissions by default. This means I spend less time approving actions like <code>dotnet build</code> and get faster results. You may not want to start here, but after refining my planning abilities and adding context I found the restrictions unnecessary.</p>
<h2 id="the-model-that-sings"><a aria-hidden="true" tabindex="-1" href="#the-model-that-sings"><span class="icon icon-link"></span></a>The model that sings</h2>
<p>Before starting this experiment, I had limited requests available and frequently defaulted to <code>claude-haiku-4.5</code> for my chat-based AI needs. Haiku is cheap, and my chat usage focuses on small sections of code that rarely need large context. It worked there. It didn't work for this. More advice came into play here: "Use the big one."</p>
<p>I switched to <code>claude-opus-4.5</code> for almost all my sessions. I've started to pull back and use Sonnet and Haiku again on occasion, but if you repeat this experiment I highly recommend starting with the biggest cloud model you have access to. The larger context window and more advanced reasoning capabilities make a huge difference. I noticed this immediately during planning, as Opus would note code duplication or architectural patterns I hadn't mentioned, sparing me from replanning later. During implementation, Opus just performed better. The code was cleaner, aligned to our best practices, and required less steering. The debugging was more accurate too. Overall, Opus was the model for me.</p>
<h2 id="automating-context"><a aria-hidden="true" tabindex="-1" href="#automating-context"><span class="icon icon-link"></span></a>Automating context</h2>
<p>Copilot can use your existing documentation for retrieving context. Developers either love or neglect their READMEs, wikis, and code comments. I can author an amazing README, but I hate wikis and neglect inline documentation. READMEs, however, are for people. Agents love a different set of documents: copilot-instructions.md, AGENTS.md, CLAUDE.md, and so on. Given the variety of tools available, I wanted a consistent approach, so I decided to adopt an <code>AGENTS.md</code> file for my projects.</p>
<p>Copilot can help with this. It offers a <code>/init</code> command that scans your codebase and generates its preferred <code>copilot-instructions.md</code> file under the <code>.github</code> directory. I ran this, then did what I love and started plan mode. In that plan, I asked Copilot to generate an <code>AGENTS.md</code> file based on the generated instructions and to reference it from <code>copilot-instructions.md</code>. I also ideated other useful context: common <code>dotnet</code> commands to run, core libraries we are using that I had noticed it spinning on or diverging from, and behavior preferences like using conventional commits.</p>
<p>After planning, I said "Do it" and let Copilot modify and generate our context files. This took only a few minutes, and it removes a lot of upfront planning time in future sessions. Now, every time I plan with Copilot, I don't need to remind it that Microsoft.Testing.Platform is our test runner with different filter options, or correct it when it uses NSubstitute instead of Moq. The context files handle this for me.</p>
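<p>To make this concrete, here is a trimmed-down sketch of what such an <code>AGENTS.md</code> might contain for a project like the one described above. The specific sections and wording are illustrative, not a copy of my real file:</p>
<pre><code class="language-markdown"># AGENTS.md

## Project overview
A .NET 10 solution using Minimal APIs. See README.md for service descriptions.

## Build and test
- Build: `dotnet build`
- Test: `dotnet test` (xUnit v3 on Microsoft.Testing.Platform; VSTest-style filter flags do not apply)

## Conventions
- Use Moq for mocking dependencies; do not introduce NSubstitute.
- Use conventional commits for all commit messages.
</code></pre>
<p>The value is less in any one line and more in never repeating these reminders at plan time.</p>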
<h2 id="model-context-protocol"><a aria-hidden="true" tabindex="-1" href="#model-context-protocol"><span class="icon icon-link"></span></a>Model context protocol</h2>
<p>I just mentioned Microsoft.Testing.Platform (MTP). We are utilizing it as our new test runner, and it's new. Most current models were trained when the VSTest runner was the only option, or at least the most common one. As a result, I noticed Copilot frequently using options and flags that don't work, wasting my time and tokens. To solve this, I leaned into another AI innovation: the model context protocol (MCP). Copilot allows us to add MCP servers through its TUI with the <code>/mcp add</code> command. In my case, as a .NET and Azure developer, I added the <a href="https://github.com/microsoftdocs/mcp">Microsoft Learn MCP</a>. With this, during my AGENTS.md planning I told Copilot to "fetch the docs" related to MTP, note the filter options, and create a cheat sheet for itself. Since doing this, Copilot has stopped spinning on running tests.</p>
<p>This saved me so much time and frustration that I looked for other MCP servers to add. I found an <a href="https://aspire.dev/get-started/configure-mcp/">Aspire MCP</a> and a <a href="https://github.com/microsoft/playwright-mcp">Playwright MCP</a>, both of which have improved my experience. It's embarrassing to say how long it took me to start utilizing MCP servers, but now that I have, I can't imagine going back.</p>
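<p>For reference, registering a server is a single slash command inside the TUI, after which Copilot prompts for the server details. A rough sketch; the exact prompts vary by version, and the name below is my own label for the Microsoft Learn MCP's published HTTP endpoint:</p>
<pre><code class="language-bash">/mcp add

# When prompted, provide the server details, e.g.:
#   Name: microsoft-learn
#   Type: HTTP
#   URL:  https://learn.microsoft.com/api/mcp
</code></pre>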
<h2 id="my-findings"><a aria-hidden="true" tabindex="-1" href="#my-findings"><span class="icon icon-link"></span></a>My findings</h2>
<p>At this point, I've mostly internalized this new workflow:</p>
<ol>
<li>Open work item in backlog (Azure Boards, GitHub Issues, etc.)</li>
<li>Review requirements and acceptance criteria</li>
<li>Launch Copilot CLI</li>
<li>Collaborate on plan with Copilot until ready</li>
<li>Say "Do it" to start implementation</li>
<li>Review code changes, provide feedback, and replan as needed</li>
<li>Run system and do product testing</li>
<li>Create a pull request for team review</li>
</ol>
<p>This workflow isn't perfect, but I find myself at the end of this experiment with a clear takeaway: using Copilot CLI has leveled up my productivity in a meaningful way. It's not automating my job, but it is morphing my inner loop. I still need refined work to do. I still have to understand how requirements might be implemented in complex systems, how they interact with existing code, and what edge cases to consider. However, my mental models are shifting to markdown files, plans, and prompts. I'm making high-level decisions about what code to write, then letting Copilot write the code. I review, I steer, I still test and run the application; I'm just less concerned with language syntax, boilerplate, and whether I missed a semicolon. I recall my year as a product owner and feel like I've shifted halfway in between. Programming is getting automated, but software engineering is still very much a part of the job.</p>
<p>At the end of the two weeks, I have to answer whether I will continue using Copilot CLI. The answer is a resounding yes. This is my new normal. I plan to see if additional MCP servers are available for other areas of the tech stack. I want to explore subagents for segmenting context and tasks. I need to refine and improve my new developer experience. I find myself taking another step away from skepticism and toward curiosity. Software engineering isn't going away, but the way we do it is changing.</p>
<p>I started a skeptic. Now, I'm a believer.</p>]]></content:encoded>
            <category>ai</category>
            <category>copilot</category>
<enclosure url="https://victorfrye.com/assets/_blog/copilot-experiment/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Hello 2026]]></title>
            <link>https://victorfrye.com/blog/posts/hello-2026</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/hello-2026</guid>
            <pubDate>Mon, 02 Feb 2026 05:00:00 GMT</pubDate>
<description><![CDATA[Hello 2026! We welcome the new year by setting new goals and plans for personal and professional growth.]]></description>
            <content:encoded><![CDATA[<p>Hello 2026! I had hoped to launch this blog post earlier, but the start of the year has been busy busy busy. And that fact is a large part of this post. I wrote a <a href="/blog/posts/2025-wrapped">wrap-up of 2025</a> reviewing my goals and accomplishments last year, but with reflection we also get the chance to reset and pivot. This post will cover my thoughts and where we are at the start of 2026. It will conclude with what those thoughts turn into as actionable goals for the year ahead.</p>
<h2 id="reflecting-on-2025"><a aria-hidden="true" tabindex="-1" href="#reflecting-on-2025"><span class="icon icon-link"></span></a>Reflecting on 2025</h2>
<p>I didn't accomplish everything I set out to in 2025. I also accomplished a lot. 2025 was the year I broke out in community contributions. I presented 16 sessions at conferences, user groups, and guilds within <a href="https://leadingedje.com">Leading EDJE</a>. I launched my blog here and posted 11 articles covering Aspire, .NET, and various DevOps technologies. I made new friends at <a href="https://softwaregr.com">SoftwareGR</a>, <a href="https://beercitycode.com">Beer City Code</a>, the <a href="https://www.meetup.com/grand-rapids-microsoft-azure-meetup-group">West Michigan Azure User Group</a>, and multiple other Michigan and greater Midwest developer communities. I was promoted to Senior Solutions Developer II at my firm, Leading EDJE, and took on new responsibilities as a DevOps specialist. Many of these accomplishments pushed me beyond my previous limits in my professional life, and they are already opening new opportunities this year. I also fell short a few times: chances to learn and grow.</p>
<p>My greatest accomplishment last year, though, was none of these. It was buying my first house with my wife and settling into a new home together. That sounds major on its own, but it happened suddenly and came with much support. It also came despite a year in which I neglected many personal goals. That is one area I will do better in this year.</p>
<h2 id="the-2026-pivot"><a aria-hidden="true" tabindex="-1" href="#the-2026-pivot"><span class="icon icon-link"></span></a>The 2026 pivot</h2>
<p>With reflection comes the chance to pivot. I have decided to make some changes in 2026. Some are small, some are large. Some are personal, some are professional. Some are public, some are private. Some are continuations, some are new.</p>
<p>When setting goals, I start with what I want to be doing in the year. When I sit down and ask myself what I enjoy, I arrive at a focus on community, learning, and fun. I didn't start with these kinds of buzzwords; instead, I found them by listing my passions and grouping them together.</p>
<p>I get to community by reflecting on last year's professional successes. The greatest new opportunities came from meeting community leaders at Beer City Code and the West Michigan Azure User Group here in Grand Rapids. I love my hometown and want to invest in it so the developer community in GR can thrive. I also want to continue to travel and connect with the greater developer community across the Midwest and beyond. Learning is a natural extension of software engineering: the field is always changing, and staying current is vital. Generative AI is an industry-wide shift, and I also have upskilling opportunities in cloud-native development and DevOps practices.</p>
<p>Fun is what I neglected last year. December was rough and revealed how pushing too hard without balance can throw life out of whack. I want a solid foundation of personal health, relationships, and hobbies to balance my professional life. This means more intentional time with my wife, friends, and myself. I want to play more games, read more books, and run more miles. These are all things that bring me joy and recharge my batteries.</p>
<p>With these three pillars in mind, I can now set my overarching goal for 2026:</p>
<blockquote>
<p>In 2026, I will prioritize community, learning, and fun to achieve a balanced and fulfilling year.</p>
</blockquote>
<p>From this, I now break down this goal with quantifiable milestones.</p>
<h3 id="objective-1-community"><a aria-hidden="true" tabindex="-1" href="#objective-1-community"><span class="icon icon-link"></span></a>Objective 1: Community</h3>
<p>Community is about connection. I want to grow the Grand Rapids developer community, and I want to have real conversations with others. One personal belief of mine is that the best way to connect is over food and drink; sharing a meal or a conversation over coffee establishes real relationships. I also received some amazing invitations from new friends last year that open the door to more ambitious goals this year. To these ends, I set the following community goals for 2026:</p>
<table>
<thead>
<tr>
<th>Key result</th>
<th>Priority</th>
</tr>
</thead>
<tbody>
<tr>
<td>Be an active organizer for the West Michigan Azure User Group</td>
<td>High</td>
</tr>
<tr>
<td>Be an active organizer for Beer City Code conference</td>
<td>High</td>
</tr>
<tr>
<td>Meet and connect 1:1 with 10 local developers over coffee or a meal</td>
<td>Medium</td>
</tr>
<tr>
<td>Present 10 public sessions at local or regional conferences, user groups, or guilds</td>
<td>Low</td>
</tr>
</tbody>
</table>
<h3 id="objective-2-learning"><a aria-hidden="true" tabindex="-1" href="#objective-2-learning"><span class="icon icon-link"></span></a>Objective 2: Learning</h3>
<p>Learning is about growth. I want to continue to grow my skills as a developer. Learning itself is hard to quantify, but I can set goals around outcomes that naturally follow it. Certifications provide formal validation of learning. I also tend to write posts as I learn. I was torn on whether blog posts belong under community, since I also write to contribute there, but the reality is my blog is about sharing what I learn with readers. Therefore, I set the following learning goals for 2026:</p>
<table>
<thead>
<tr>
<th>Key result</th>
<th>Priority</th>
</tr>
</thead>
<tbody>
<tr>
<td>Earn the Microsoft DevOps Engineer Expert certification</td>
<td>High</td>
</tr>
<tr>
<td>Write and publish 20 posts on my blog</td>
<td>High</td>
</tr>
<tr>
<td>Earn cloud specialist recognition from Leading EDJE</td>
<td>Medium</td>
</tr>
<tr>
<td>Earn the Microsoft Azure Solutions Architect Expert certification</td>
<td>Low</td>
</tr>
</tbody>
</table>
<h3 id="objective-3-fun"><a aria-hidden="true" tabindex="-1" href="#objective-3-fun"><span class="icon icon-link"></span></a>Objective 3: Fun</h3>
<p>Fun is about living. Last year, I started running, including my first formal race. This year, I want to resume running and push myself toward sustainable fitness. I also love video games and want to play more with less guilt. Reading is another hobby I neglected but love. Those cover personal fun, often in solitude. However, I have a wife I deeply love, and I own a house now! I've found joy in hosting friends and family, and I want to invest in this home we are building. Finally, I want to travel for reasons unrelated to developer conferences; exploring new places with my wife is a joy I want to prioritize. Therefore, I set the following fun goals for 2026:</p>
<table>
<thead>
<tr>
<th>Key result</th>
<th>Priority</th>
</tr>
</thead>
<tbody>
<tr>
<td>Host 6 gatherings at our new home with friends or family</td>
<td>High</td>
</tr>
<tr>
<td>Complete 4 major home improvement projects</td>
<td>High</td>
</tr>
<tr>
<td>Play 25 new video games</td>
<td>Medium</td>
</tr>
<tr>
<td>Run a 10K race</td>
<td>Medium</td>
</tr>
<tr>
<td>Take a vacation trip with my wife to a new destination</td>
<td>Medium</td>
</tr>
<tr>
<td>Read 8 books</td>
<td>Low</td>
</tr>
</tbody>
</table>
<h2 id="ready-set-go"><a aria-hidden="true" tabindex="-1" href="#ready-set-go"><span class="icon icon-link"></span></a>Ready, set, go</h2>
<p>With these goals set, I feel ready to tackle 2026. Most of these are already in progress. Altogether, they feel ambitious, yet achievable. Many are metrics I already hit last year and only ask me to do it again; some are new challenges that will push me. But one often-missed fact of goal-setting is that goals are not set in stone. If I find myself overwhelmed or under-challenged, I can always pivot, adjust, or scrap goals as needed. What is important is that I have a north star to guide me through the year. With that, I am excited to see what 2026 has in store.</p>
<p>Here is to a 2026 filled with community, learning, and fun. Cheers!</p>]]></content:encoded>
            <category>career</category>
            <category>events</category>
            <category>goals</category>
<enclosure url="https://victorfrye.com/assets/_blog/2026-hello/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Docker Desktop to Podman]]></title>
            <link>https://victorfrye.com/blog/posts/docker-to-podman-windows-migration</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/docker-to-podman-windows-migration</guid>
            <pubDate>Wed, 21 Jan 2026 05:00:00 GMT</pubDate>
            <description><![CDATA[A quick guide for Windows developers to transition from Docker Desktop to Podman.]]></description>
<content:encoded><![CDATA[<p>Docker is synonymous with containers. For years, Docker Desktop has been the go-to solution for Windows developers to build, run, and manage containers locally without complex setup. However, alternatives have emerged and matured, and Docker Desktop licensing is no longer free for many users and enterprises. One such alternative is Podman, a daemonless container engine that offers Docker compatibility and a similarly hassle-free experience for Windows developers. This post will guide you through the steps to migrate from Docker Desktop to Podman on Windows.</p>
<h2 id="docker-licensing"><a aria-hidden="true" tabindex="-1" href="#docker-licensing"><span class="icon icon-link"></span></a>Docker licensing</h2>
<p>There are many reasons <a href="https://docker.com/products/docker-desktop/">Docker Desktop</a> remains a staple for Windows developers, and even more why Docker became the de facto standard for containers. However, <a href="https://docker.com/pricing/">Docker Desktop licensing is no longer free</a> for many users and enterprises. This change does not affect the open-source Docker Engine, which remains free to use along with its command-line interface and Docker Compose. The change disproportionately impacts Windows users. On macOS and Linux, Docker works seamlessly due to Unix-like operating systems and Linux container compatibility. On Windows, the Windows Subsystem for Linux (WSL) is the underlying technology that makes the Docker engine work, and interacting with it across Windows and WSL requires complex setup. Docker Desktop abstracts that complexity away.</p>
<p>What always bothered me about moving away from Docker Desktop was the lack of a simple solution that provided a similar onboarding experience. Enter <a href="https://podman.io">Podman</a>.</p>
<h2 id="why-podman"><a aria-hidden="true" tabindex="-1" href="#why-podman"><span class="icon icon-link"></span></a>Why Podman?</h2>
<p>Podman is an open-source container engine that provides a Docker-compatible CLI and API. It is sponsored by <a href="https://redhat.com">Red Hat</a> and has gained significant traction in the container ecosystem. Podman offers several advantages over Docker Desktop, but the most compelling are compliance and parity. You can utilize Podman without worrying about licensing fees, and with a quick series of steps, you can get the same benefits of Docker Desktop including a GUI, WSL integration, Docker CLI compatibility, and container management without the cost.</p>
<p>So how do you get started?</p>
<h2 id="set-up-podman-on-windows"><a aria-hidden="true" tabindex="-1" href="#set-up-podman-on-windows"><span class="icon icon-link"></span></a>Set up Podman on Windows</h2>
<p>Some assumptions before we begin:</p>
<ul>
<li>You are running Windows 10 or later.</li>
<li>You already have WSL installed and set up on your Windows machine. If not, you can follow the <a href="https://learn.microsoft.com/en-us/windows/wsl/install">Microsoft documentation</a> to get started.</li>
<li>You have Docker Desktop installed currently. If not, you can skip step 1 below.</li>
</ul>
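<p>If you are unsure whether WSL is already set up, a quick check from any terminal (assuming Windows 10 version 2004 or later, where this flag is available):</p>
<pre><code class="language-bash">wsl --status
</code></pre>
<p>If this prints a default distribution and WSL version, you are good to proceed.</p>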
<h3 id="1-uninstall-docker-desktop"><a aria-hidden="true" tabindex="-1" href="#1-uninstall-docker-desktop"><span class="icon icon-link"></span></a>1. Uninstall Docker Desktop</h3>
<p>This is the most important step. If you have Docker Desktop installed, it may conflict with Podman. It may also make you non-compliant with Docker's licensing terms. Uninstall Docker Desktop completely. One way to do this is through WinGet, the Windows package manager, if you installed it that way or via the Microsoft Store. You can run the following command in a terminal:</p>
<pre><code class="language-bash">winget uninstall Docker.DockerDesktop
</code></pre>
<h3 id="2-install-podman-and-related-tools"><a aria-hidden="true" tabindex="-1" href="#2-install-podman-and-related-tools"><span class="icon icon-link"></span></a>2. Install Podman and related tools</h3>
<p>There are multiple ways to install Podman on Windows. I recommend using WinGet because command-line tools are precise and repeatable. I also just love using the terminal.</p>
<p>There are additional tools that we will install alongside Podman to provide a similar experience to Docker Desktop. These include:</p>
<ul>
<li><strong>Podman</strong>: The core container engine and podman CLI.</li>
<li><strong>Podman Desktop</strong>: A GUI that provides similar administrative capabilities to Docker Desktop and orchestrates Docker compatibility.</li>
<li><strong>Docker CLI</strong>: The open-source and free Docker CLI for continued compatibility with <code>docker</code> CLI commands.</li>
<li><strong>Docker Compose</strong>: The open-source tooling for multi-container orchestration, powering <code>docker compose</code> and <code>podman compose</code> commands (optional).</li>
<li><strong>Kubectl</strong>: The Kubernetes CLI for managing and interacting with Kubernetes clusters (optional).</li>
</ul>
<p>Each of these maps directly to a WinGet package and is easily identifiable by ID. If you do not want to install any of the optional tools, simply omit them from the commands below. To install all of these tools, run the following commands in a terminal:</p>
<pre><code class="language-bash">winget install --id RedHat.Podman --exact
winget install --id RedHat.Podman-Desktop --exact
winget install --id Docker.DockerCLI --exact
winget install --id Docker.DockerCompose --exact
winget install --id Kubernetes.kubectl --exact
</code></pre>
<p>After installation, you should be able to run <code>podman version</code> in your terminal to verify Podman is installed correctly. You can also launch Podman Desktop and see the desktop application GUI.</p>
<h3 id="3-enable-docker-compatibility-in-podman-desktop"><a aria-hidden="true" tabindex="-1" href="#3-enable-docker-compatibility-in-podman-desktop"><span class="icon icon-link"></span></a>3. Enable Docker compatibility in Podman Desktop</h3>
<p>Okay, now we have all of our tools installed. The next step is to enable Docker compatibility in Podman Desktop. This will allow all Docker tools to utilize the Podman engine. Unfortunately, you have to leave the CLI to do this.</p>
<ol>
<li>Open <strong>Podman Desktop</strong></li>
<li>Go to <strong>Settings</strong> on the navigation bar.</li>
<li>Expand the <strong>Preferences</strong> section.</li>
<li>Click on <strong>Docker Compatibility</strong>.</li>
<li>Toggle <strong>Enabled</strong> to on for Docker compatibility.</li>
</ol>
<p>After this, you should have a new <strong>Docker Compatibility</strong> section that shows the system socket status, Podman Compose CLI support, and the Docker CLI context. That Docker CLI context is very important: it means any <code>docker</code> commands you run in your terminal will now use Podman as the backend engine! This is the magic that makes the transition seamless.</p>
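<p>You can also sanity-check this wiring from the terminal. A quick look, assuming the Docker CLI is on your PATH; the exact context name and endpoint will vary by Podman Desktop version:</p>
<pre><code class="language-bash"># List Docker CLI contexts; the active one (marked with *)
# should point at the Podman machine's socket or named pipe
docker context ls
</code></pre>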
<h3 id="4-verify-the-setup"><a aria-hidden="true" tabindex="-1" href="#4-verify-the-setup"><span class="icon icon-link"></span></a>4. Verify the setup</h3>
<p>At this point, assuming installations succeeded and no UI changes confounded you, you should be ready to go! Let's verify everything is working correctly with some simple commands:</p>
<p>First, let's check the Podman version:</p>
<pre><code class="language-bash">podman version
</code></pre>
<p>You will see output with <code>Client</code> and <code>Server</code> sections, similar to Docker. The server version is the most important part: Podman Desktop runs the podman machine in WSL for us. If you hit issues here or with other podman commands, you may want to restart your computer and ensure Podman Desktop is auto-starting the podman machine. You can check this with the following command:</p>
<pre><code class="language-bash">podman machine list
</code></pre>
<p>This command should show a running machine named <code>podman-machine-default</code> with a last-up status of <code>Currently running</code>.</p>
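<p>If the machine exists but is not listed as running, you can usually start it yourself before resorting to a reboot. Treat this as a fallback sketch; Podman Desktop should normally start the machine for you on launch:</p>
<pre><code class="language-bash"># Start the default Podman machine manually; the fallback message
# keeps this safe to run even when the machine cannot be started.
podman machine start || echo "podman machine could not be started"
</code></pre>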
<p>Now, let's verify Docker CLI compatibility by running:</p>
<pre><code class="language-bash">docker version
</code></pre>
<p>This should output two sections again: <code>Client</code> and <code>Server</code>. The server section should match the Podman server version from earlier. This means your Docker CLI is successfully talking to the Podman engine!</p>
<p>The last thing to do is verify we can pull and run a container. You can use any hello world, but I like using the Microsoft MCR Hello World image. Let's do it:</p>
<pre><code class="language-bash">docker run mcr.microsoft.com/mcr/hello-world:latest
</code></pre>
<p>You should see output similar to the following:</p>
<pre><code class="language-text">Hello from Docker!
This message shows that your installation appears to be working correctly.

To generate this message, Docker took the following steps:
 1. The Docker client contacted the Docker daemon.
 2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
    (amd64)
 3. The Docker daemon created a new container from that image which runs the
    executable that produces the output you are currently reading.
 4. The Docker daemon streamed that output to the Docker client, which sent it
    to your terminal.

To try something more ambitious, you can run an Ubuntu container with:
 $ docker run -it ubuntu bash

Share images, automate workflows, and more with a free Docker ID:
 https://hub.docker.com/

For more examples and ideas, visit:
 https://docs.docker.com/get-started/
</code></pre>
<p>In our case, the Docker client actually contacted the Podman engine. You can also skip the <code>docker</code> CLI and run the same command with <code>podman</code>:</p>
<pre><code class="language-bash">podman run mcr.microsoft.com/mcr/hello-world:latest
</code></pre>
<h2 id="the-end-result"><a aria-hidden="true" tabindex="-1" href="#the-end-result"><span class="icon icon-link"></span></a>The end result</h2>
<p>At this point, you have successfully migrated from Docker Desktop to Podman on Windows! You can continue using all your existing Docker CLI commands and workflows, but now with Podman as the backend engine. You also have a GUI with Podman Desktop that provides similar functionality to Docker Desktop.</p>
<p>For additional reading about containers, consider learning about <a href="/blog/posts/multi-stage-docker-dotnet-guide">multi-stage builds</a>. Otherwise, happy containerizing with Podman!</p>]]></content:encoded>
            <category>cloudnative</category>
            <category>devops</category>
            <category>docker</category>
            <category>podman</category>
            <category>windows</category>
<enclosure url="https://victorfrye.com/assets/_blog/docker-to-podman-windows-migration/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[CodeMash 2026]]></title>
            <link>https://victorfrye.com/blog/posts/codemash-2026</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/codemash-2026</guid>
            <pubDate>Tue, 20 Jan 2026 05:00:00 GMT</pubDate>
            <description><![CDATA[Thank you CodeMash 2026! Thank you to the attendees, organizers, sponsors, and my fellow speakers for making this event a success.]]></description>
<content:encoded><![CDATA[<p>Thank you to the greater Midwest developer community for an exceptional CodeMash 2026! This year, I was honored to speak at this incredible event for the first time, and it was such a joy to attend and connect with so much of the talent and passion from across the region. CodeMash marks the largest conference I have attended to date, and the energy and culture of the event were truly inspiring. I met so many new friends and saw many familiar faces as well. For the first time ever, I met someone in person who had discovered me through this blog. I love the Midwest developer community and look forward to returning in the future. Cheers!</p>
<p>This blog post is a reference for those who attended my presentations as well as a thank you letter to everyone involved.</p>
<h2 id="ai-observability-and-evaluation-workshop"><a aria-hidden="true" tabindex="-1" href="#ai-observability-and-evaluation-workshop"><span class="icon icon-link"></span></a>AI Observability and Evaluation workshop</h2>
<p>On Wednesday, I had the pleasure of co-hosting and supporting my friend <a href="https://www.linkedin.com/in/matteland">Matt Eland</a> in his four-hour workshop titled "Testing, Evaluating, and Monitoring AI agents using AI Evaluation and OpenTelemetry". This workshop was a deep dive into AI evaluators leveraging <code>Microsoft.Extensions.AI.Evaluation</code> along with OpenTelemetry and Aspire for observability. AI evaluators are going to be vital for building responsible, high-quality AI solutions moving forward, and putting together any workshop is no small feat. Thank you to Matt for inviting me to assist and for all the hard work he put into making this workshop a success.</p>
<p>One exciting callout is how this workshop leveraged Aspire to orchestrate the entire solution, letting attendees focus on the evaluation code itself rather than the scaffolding required to run multiple lessons as code.</p>
<p><img src="/assets/_blog/codemash-2026/workshop_dashboard.jpg" alt="AI Observability and Evaluation workshop Aspire dashboard"></p>
<p>Some additional helpful links and resources from the workshop include:</p>
<ul>
<li><a href="https://github.com/integerman/aiobservabilityandevaluationworkshop">Workshop repository</a></li>
<li><a href="https://learn.microsoft.com/en-us/dotnet/ai/evaluation/libraries">Microsoft.Extensions.AI.Evaluation documentation</a></li>
<li><a href="https://blog.leadingedje.com/post/ai/evaluation.html">An LLM Evaluation Framework for AI Systems Performance blog post by Matt Eland</a></li>
<li><a href="https://blog.leadingedje.com/post/ai/evaluationreporting.html">Tracking AI System Performance using AI Evaluation Reports blog post by Matt Eland</a></li>
</ul>
<h2 id="diet-docker"><a aria-hidden="true" tabindex="-1" href="#diet-docker"><span class="icon icon-link"></span></a>Diet Docker</h2>
<p>I also presented a general session on Thursday titled "Diet Docker: Crafting lightweight containers with multi-stage builds". The session focused on an overview of container build tools, a quick dive into multi-stage builds, and the impact of Linux distribution choice on container size and security. I used .NET containers for the sample apps due to my own familiarity, how well the .NET build lifecycle aligns with multi-stage builds, and the variety of official .NET container images available. For those interested in learning more, I recommend reading here on my blog and following along as I continue to explore containers. Some helpful links and resources include:</p>
<ul>
<li><a href="https://github.com/victorfrye/presentations/blob/main/files/2026/codemash/dietdocker.pdf">Presentation slide deck</a></li>
<li><a href="https://github.com/victorfrye/hellodocker">Hello Docker repository</a></li>
<li><a href="/blog/posts/multi-stage-docker-dotnet-guide">Multi-Stage Docker Builds blog post</a></li>
<li><a href="https://blog.leadingedje.com">Leading EDJE blog</a></li>
</ul>
<h2 id="leading-edje"><a aria-hidden="true" tabindex="-1" href="#leading-edje"><span class="icon icon-link"></span></a>Leading EDJE</h2>
<p>I want to give a special thank you to my employer, <a href="https://leadingedje.com">Leading EDJE</a>, for sponsoring my attendance at CodeMash 2026. Leading EDJE is a fantastic place to work, and I am grateful for their support in allowing me to attend and speak at this event. We had a total of 12 employees attend CodeMash this year, 3 of whom, including myself, presented sessions. Although Leading EDJE was not a direct sponsor of the event, I love seeing them invest in the professional development of their employees and foster a culture of learning and growth that produces so many speakers and community contributors.</p>
<h2 id="thank-you"><a aria-hidden="true" tabindex="-1" href="#thank-you"><span class="icon icon-link"></span></a>Thank you</h2>
<p>Thank you again if you attended my presentations or any of the other amazing sessions at CodeMash 2026. Attendees are why all of us, as speakers, sponsors, and organizers, put so much effort into conferences. Also thank you to the organizers and volunteers who made this event possible. And a special thank you to the sponsors. These are the companies helping to make this event happen and investing in the future of the Midwest developer community:</p>
<ul>
<li><a href="https://infernored.com">InfernoRed Technology</a></li>
<li><a href="https://flexjet.com">Flexjet</a></li>
<li><a href="https://aws.amazon.com">AWS</a></li>
<li><a href="https://tuxcare.com">TuxCare</a></li>
<li><a href="https://redhat.com">Red Hat</a></li>
<li><a href="https://umbraco.com">Umbraco</a></li>
<li><a href="https://codelogic.com">CodeLogic</a></li>
<li><a href="https://jumpmind.com">Jumpmind</a></li>
<li><a href="https://textcontrol.com">Text Control</a></li>
<li><a href="https://callibrity.com">Callibrity</a></li>
<li><a href="https://cgi.com">CGI</a></li>
<li><a href="https://nimblepros.com">NimblePros</a></li>
<li><a href="https://improving.com">Improving</a></li>
<li><a href="https://cnwr.com">CNWR IT Consultants</a></li>
<li><a href="https://gitbutler.com">GitButler</a></li>
<li><a href="https://cyberdrain.com">CyberDrain</a></li>
</ul>]]></content:encoded>
            <category>dotnet</category>
            <category>aspire</category>
            <category>ai</category>
            <category>docker</category>
            <category>azure</category>
            <category>events</category>
            <enclosure url="https://victorfrye.com/assets/_blog/codemash-2026/banner.png" length="0" type="image/png"/>
        </item>
        <item>
            <title><![CDATA[2025 Wrapped]]></title>
            <link>https://victorfrye.com/blog/posts/2025-wrapped</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/2025-wrapped</guid>
            <pubDate>Mon, 22 Dec 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[2025 is nearly over, and it's time to reflect on goals met and achievements made throughout the year.]]></description>
<content:encoded><![CDATA[<p>The year is 2025, for a few more days. At the start of the year I set a plethora of goals, both personal and professional. I've repeatedly been informed I make ambitious plans and aggressive goals, yet I also meet them. Like open-source software, I think there is value in open-sourcing my goals and progress to share the victories, setbacks, and lessons learned. My goal in this is to provide transparency, invite feedback, and maybe inspire others to set and achieve their own ambitious goals.</p>
<p>I didn't document my goals publicly this year, so first we will review what was set and why. Starting next year, I hope to open-source my goal setting process to provide more visibility and accountability. We'll see how this goes. There is an element of humility and vulnerability in sharing one's goals and progress, but continuous growth requires facing fears and embracing failure.</p>
<p>Let's dive in.</p>
<h2 id="goals"><a aria-hidden="true" tabindex="-1" href="#goals"><span class="icon icon-link"></span></a>Goals</h2>
<p>I started 2025 with the idea of trying a goal framework to help structure and track my progress throughout the year. Setting goals is itself an art form, and there are plenty of frameworks and methodologies to choose from. I decided to experiment with <em>OKRs</em> (Objectives and Key Results) with an individual focus. The appeal was writing abstract statements of what I wanted to achieve and then breaking them down into measurable key results. As an individual, fuzzy goals are more natural. They match how we think and operate. You might want to pay down debt, feel healthier, or build a habit. Very rarely are our goals naturally quantifiable. OKRs are typically found in enterprise settings, but they worked well enough for personal use.</p>
<p>My objectives for 2025 were:</p>
<ol>
<li><strong>Knowledge expertise</strong>: To grow and certify cloud-native and DevOps knowledge expertise.</li>
<li><strong>Community participation</strong>: To actively participate and connect with others in the developer community.</li>
<li><strong>Community contributor</strong>: To become a known active contributor to the developer community.</li>
<li><strong>Happy and healthy family</strong>: To build a solid foundation for a happy and healthy family.</li>
</ol>
<p>Some quick reasonings behind each and why they were important to me:</p>
<ul>
<li><strong>Knowledge expertise</strong>: I found my specialty in cloud-native DevOps and wanted to really focus on deepening this specialty. I find certifications to be a tangible way to validate my knowledge, a quantifiable target to aim for, and my employer, Leading EDJE, reimburses for certification exams, making it a win-win situation. I also enjoy reading and learning new concepts, so this aligned with previously unspoken goals for continuous learning.</li>
<li><strong>Community participation</strong>: I had recently discovered local user groups and a conference speaking circuit where I was meeting new friends. As a remote worker, my social circle is limited, but I quickly learned these communities fill that gap, letting me share an excitement about Docker or new Azure features that traditionally bores my wife to tears.</li>
<li><strong>Community contributor</strong>: Distinct from participation, I started 2025 after my first two conference talks and realized I love sharing knowledge in this form. I decided to double down on not only participating but contributing through conference speaking as well as exploring blogging.</li>
<li><strong>Happy and healthy family</strong>: I started this year at the end of my 20s celebrating my first anniversary with my beautiful wife. Moving from newlywed status to permanent partnership meant thinking about a foundation for multiple years ahead. Additionally, building that foundation meant building new habits for myself.</li>
</ul>
<p>Now we review the key results and progress made towards each objective.</p>
<h3 id="knowledge-expertise"><a aria-hidden="true" tabindex="-1" href="#knowledge-expertise"><span class="icon icon-link"></span></a>Knowledge expertise</h3>
<blockquote>
<p>Grow and certify cloud-native and DevOps knowledge expertise.</p>
</blockquote>
<table>
<thead>
<tr>
<th>Key result</th>
<th>Priority</th>
<th>Progress</th>
</tr>
</thead>
<tbody>
<tr>
<td>Achieve recognition as a DevOps specialist at Leading EDJE</td>
<td>High</td>
<td>Completed</td>
</tr>
<tr>
<td>Achieve Azure Data Fundamentals certification from Microsoft</td>
<td>Low</td>
<td>Completed</td>
</tr>
<tr>
<td>Achieve DevOps Engineer Expert certification from Microsoft</td>
<td>High</td>
<td>Off track</td>
</tr>
<tr>
<td>Read four books on cloud-native or DevOps topics</td>
<td>Medium</td>
<td>Off track</td>
</tr>
<tr>
<td>Achieve Azure Developer Associate certification from Microsoft</td>
<td>Medium</td>
<td>Off track</td>
</tr>
</tbody>
</table>
<p>For knowledge expertise, my primary key results were internal recognition at Leading EDJE, Microsoft certifications, and reading books. Overall, I missed most of these key results and still call the objective a success for the year. Setting goals like this is about stretching myself and pursuing growth, not just checking off boxes. Of the key results set, I completed only two of five. More than that, I was completely off track on two (reading and the developer certification). Early in the year I realized the DevOps Engineer certification was more immediately important to me than the Developer Associate, so I shifted my focus. And while I enjoy reading books for professional development, this year I never found the drive for that kind of reading, so I trusted myself and de-prioritized it.</p>
<p>That leaves me with the missed DevOps certification. I started studying in the summer, but life repeatedly disrupted my plans. I was exam ready around Thanksgiving and scheduled my exam, but unfortunately I didn't pass. I missed by 27 points! While really frustrating, I plan to retake the exam in January. Wish me luck.</p>
<h3 id="community-participation"><a aria-hidden="true" tabindex="-1" href="#community-participation"><span class="icon icon-link"></span></a>Community participation</h3>
<blockquote>
<p>Be an active participant and connect with others in the developer community.</p>
</blockquote>
<table>
<thead>
<tr>
<th>Key result</th>
<th>Priority</th>
<th>Progress</th>
</tr>
</thead>
<tbody>
<tr>
<td>Meet 1:1 with six local developers</td>
<td>Medium</td>
<td>Completed</td>
</tr>
<tr>
<td>Attend 10 community sessions on cloud-native or DevOps topics</td>
<td>Medium</td>
<td>Completed</td>
</tr>
<tr>
<td>Become an organizer of a local user group</td>
<td>Low</td>
<td>Completed</td>
</tr>
</tbody>
</table>
<p>Considering where I started this year, I blew my goals out of the water in this objective. I had only attended two user groups before this year and three developer conferences in my lifetime. My first key result here is founded on a personal passion: conversation over food or drinks with others. There is true bonding that happens over a meal or coffee. I ended up building a new mentorship relationship here, and we now get coffee regularly. I also met new people and learned about their lives over coffee. This is something I will continue to prioritize in the coming year: doing more of it and engaging repeatedly with the same people.</p>
<p>My other two key results were about conferences and user groups, specifically showing up and becoming invested. I wanted to go from my preexisting five sessions to over double that in a single year. I succeeded, attending eight public user group sessions, four conferences, and a plethora of internal guild meetings at Leading EDJE. Being engaged in each of these helped me find those connections to get coffee with and build a network of friends at my company and in the Midwest developer community. My last key result was becoming an organizer of a local user group, which until very recently was off track. I initially planned to launch my own but instead found connections with the Beer City Code conference and the local West Michigan Azure User Group, both of which I will be taking a more active role in helping organize in the future. I didn't really understand at the start what all is involved in organizing these kinds of events, but I'm learning. Cheers to seeing how I can help grow these communities in 2026!</p>
<h3 id="community-contributor"><a aria-hidden="true" tabindex="-1" href="#community-contributor"><span class="icon icon-link"></span></a>Community contributor</h3>
<blockquote>
<p>Become an active contributor in the developer community.</p>
</blockquote>
<table>
<thead>
<tr>
<th>Key result</th>
<th>Priority</th>
<th>Progress</th>
</tr>
</thead>
<tbody>
<tr>
<td>Speak at three conferences</td>
<td>High</td>
<td>Completed</td>
</tr>
<tr>
<td>Present three guild talks at Leading EDJE</td>
<td>High</td>
<td>Completed</td>
</tr>
<tr>
<td>Write six blog posts</td>
<td>Low</td>
<td>Completed</td>
</tr>
<tr>
<td>Present one talk at a local user group</td>
<td>Medium</td>
<td>Completed</td>
</tr>
</tbody>
</table>
<p>Here was another overachievement for the year. I was new to the speaking circuit at the start of the year. One oddity is that I started speaking in the Ohio developer community before I had ever spoken locally in West Michigan, partly due to my work at Leading EDJE, based in Columbus. I wanted to outdo myself and speak at at least one more conference in 2025 than in 2024. I likewise wanted to present as many times as possible at guilds (what we call internal sessions at Leading EDJE, often presented over lunch). But more than anything, I wanted to start speaking in Michigan and closer to home. I succeeded. I made a lot of friends in the SoftwareGR and local Azure communities. I also presented two sessions at my hometown conference, <a href="https://beercitycode.com">Beer City Code</a>. And perhaps most understated was the launch of this blog. It has provided a platform to reference back to and a way to connect with my local and online communities. I started with a goal of six blog posts and expanded it to twelve in June; I completed that goal as well. Speaking at conferences and local user groups has been incredibly rewarding, and I look forward to continuing in 2026. My first conference will be <a href="https://codemash.org">CodeMash</a> in January.</p>
<h3 id="happy-and-healthy-family"><a aria-hidden="true" tabindex="-1" href="#happy-and-healthy-family"><span class="icon icon-link"></span></a>Happy and healthy family</h3>
<blockquote>
<p>Build a solid foundation for a happy and healthy family.</p>
</blockquote>
<table>
<thead>
<tr>
<th>Key result</th>
<th>Priority</th>
<th>Progress</th>
</tr>
</thead>
<tbody>
<tr>
<td>Become a homeowner</td>
<td>High</td>
<td>Completed</td>
</tr>
<tr>
<td>Run a 5k</td>
<td>High</td>
<td>Completed</td>
</tr>
</tbody>
</table>
<p>Okay, I didn't define these key results at the start of the year. I started 2025 too focused on professional life and neglected to set goals for my other hobbies and my family. However, I quickly rectified this informally and accomplished a number of personal milestones throughout the year. I want to do better in this regard next year. This is an area I'll avoid going into too much detail on a public site out of respect for my family's privacy. Two huge milestones I can share were becoming a homeowner and running a 5k.</p>
<p>At the start of this year, I had never run a full mile. By late summer, I was able to run 12 miles. I injured myself training for a half marathon, and since then have fallen off a bit. I plan to get back into it very soon, just like everyone says they will, starting with the new year. I'd start today, but December has proven to be a rough month. I finished three races this year, and I aim to complete at least as many in the new year.</p>
<p>More importantly, I own a real asset now: a home. My wife and I started looking in early autumn and found a place that we absolutely love. I didn't realize how much of my time and life would be monopolized by the home buying process, but it happened very quickly, and I type this from my new home office. This is the first time in over a decade that I think of a physical location as a home I might come back to for years to come. I end 2025 feeling grateful for this small ounce of stability.</p>
<h2 id="overall-reflections"><a aria-hidden="true" tabindex="-1" href="#overall-reflections"><span class="icon icon-link"></span></a>Overall reflections</h2>
<p>I accomplished a lot in 2025. I plan to do the same in 2026. This has been a year of growth, both professionally and personally, and I am grateful for the experiences and milestones I achieved. I also experienced a ton of setbacks and challenges. Family losses, injuries, professional obstacles, and simple failings like rejection letters and failed tests. It's hard to highlight everything even in the lens of goals set and met, but this year was another step forward.</p>
<h2 id="looking-ahead-to-2026"><a aria-hidden="true" tabindex="-1" href="#looking-ahead-to-2026"><span class="icon icon-link"></span></a>Looking Ahead to 2026</h2>
<p>I am still defining what I want to achieve for 2026. I will be attempting to include more personal goals alongside career development goals. 2025 was a year of challenge: challenging myself and staying resilient in the face of external challenges. I am hoping 2026 will be a year of normalizing. I want to focus on maintaining the progress I made this year and turning these new achievements into lasting habits. I am a conference speaker now, so let's return as a veteran who can mentor and support others. I author a blog, so let's build a stronger presence and consistency in content. I own a home, so let's learn to maintain it properly and make it a place of comfort. I want to run new personal bests and play video games on the couch.</p>
<p>2026 could be a year of balance and mindfulness. Or at least, that's the goal.</p>]]></content:encoded>
            <category>career</category>
            <category>events</category>
            <category>goals</category>
<enclosure url="https://victorfrye.com/assets/_blog/2025-wrapped/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Gifting SDKs with Kiota]]></title>
            <link>https://victorfrye.com/blog/posts/gifting-sdks-with-kiota</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/gifting-sdks-with-kiota</guid>
            <pubDate>Fri, 05 Dec 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[Ho ho ho! In this festive season, let's explore how Kiota can help you generate SDKs for your ASP.NET Core web APIs, making it easier to share your services with others.]]></description>
            <content:encoded><![CDATA[<p>Merry Christmas and happy holidays! It's time for a <a href="https://csadvent.christmas/">C# Advent</a> blog post.</p>
<p>'Tis the season to be jolly, and what better way to spread joy than by gifting SDKs for your ASP.NET Core web APIs? In this festive post, we'll explore how Kiota, a code generator for OpenAPI described APIs, can help you create C# SDKs for your services, making it easier for others to integrate and use them.</p>
<p>Some assumptions and clarifications before we begin:</p>
<ul>
<li>You already have an OpenAPI document for your application. It can be a file saved locally or served from a URL endpoint.</li>
<li>In this post, we'll focus on generating C# SDKs using Kiota, but Kiota also supports other languages like TypeScript, Java, and Python.</li>
<li>Kiota can generate SDKs for any OpenAPI described API, not just ASP.NET Core web APIs. However, for this post, we'll specifically look at ASP.NET Core. We're C# developers after all!</li>
</ul>
<p>In this blog post, we'll utilize an existing OpenAPI document and generate a C# SDK using Kiota.</p>
<h2 id="what-is-kiota"><a aria-hidden="true" tabindex="-1" href="#what-is-kiota"><span class="icon icon-link"></span></a>What is Kiota?</h2>
<p>Every good gift starts with understanding what you're giving. One of the major innovations in .NET is the <a href="https://github.com/microsoft/OpenAPI.NET">OpenAPI.NET</a> library, which moves OpenAPI support to first-class status within the .NET platform. <a href="https://github.com/microsoft/kiota">Kiota</a> is a Microsoft code generator that leverages this library to create SDKs for OpenAPI described APIs. It supports multiple programming languages, including C#, TypeScript, Java, and Python. There's plenty of other code generators out there, but this tight integration with OpenAPI.NET and Microsoft support makes it a compelling choice.</p>
<p>I discovered Kiota while exploring ways my team could generate SDKs for a plethora of new microservices. A common problem in microservice architectures is that each service exposes its own API, and any consumer needs to (1) know the available endpoints, (2) understand the model definitions, and (3) write the client code to call the service. This is a lot of tedious, error-prone boilerplate. An OpenAPI document helps describe the API surface and models, but a code generator like Kiota takes this a step further by generating the exact client code needed to call the service, including models, request builders, and serialization logic.</p>
<h2 id="installing-kiota"><a aria-hidden="true" tabindex="-1" href="#installing-kiota"><span class="icon icon-link"></span></a>Installing Kiota</h2>
<p>Santa has made it easy for .NET developers to get started with Kiota. It is available as a .NET tool, which means you can install it once globally or add it as a local tool to your project. To install Kiota globally, run the following command:</p>
<pre><code class="language-bash">dotnet tool install --global Microsoft.OpenApi.Kiota
</code></pre>
<p>We can verify the installation by checking the version:</p>
<pre><code class="language-bash">kiota --version
</code></pre>
<h2 id="set-up-an-sdk-project"><a aria-hidden="true" tabindex="-1" href="#set-up-an-sdk-project"><span class="icon icon-link"></span></a>Set up an SDK project</h2>
<p>Before we continue, we need an SDK project. Kiota can generate a client that you include in existing projects, but building an SDK project is a great way to package and distribute the code via NuGet. Your customers don't care how you built the SDK; they just want the gift of ready-to-use code.</p>
<p>Let's create a new class library project for our SDK:</p>
<pre><code class="language-bash">dotnet new classlib --name Sdk --output ./src/Sdk
</code></pre>
<p>Next, we need to add some dependencies to our SDK project. Kiota-generated clients rely on a few NuGet packages for abstractions, serialization, and HTTP handling. We can add these packages using the <code>Microsoft.Kiota.Bundle</code> package reference. Run the following command in the SDK project directory:</p>
<pre><code class="language-bash">dotnet package add Microsoft.Kiota.Bundle --project ./src/Sdk/Sdk.csproj
</code></pre>
<p>Let's now clear out any existing code, such as the default <code>Class1.cs</code> file created by the class library template. Kiota will generate everything we need in our SDK project.</p>
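<p>That cleanup is a one-liner from the terminal. The <code>-f</code> flag makes it a no-op if the file is already gone:</p>
<pre><code class="language-bash"># Remove the placeholder class created by the classlib template.
rm -f ./src/Sdk/Class1.cs
</code></pre>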
<h2 id="generating-the-client-with-kiota"><a aria-hidden="true" tabindex="-1" href="#generating-the-client-with-kiota"><span class="icon icon-link"></span></a>Generating the client with Kiota</h2>
<p>Now that we have our SDK project set up, it's time to generate the client code. I am going to use an OpenAPI document hosted locally for demonstration purposes, but you can use any OpenAPI document accessible via a URL or file path. Like a good Christmas wishlist, Kiota has several options to customize the generated code. We'll use some good defaults for this example and explain why they matter:</p>
<pre><code class="language-bash">kiota generate \
    --language csharp \
    --openapi https://localhost:7045/openapi/v1.json \
    --output ./src/Sdk \
    --class-name ChristmasApiClient \
    --namespace-name VictorFrye.MerryChristmas.Sdk
</code></pre>
<p>Here's a breakdown of the options used:</p>
<ul>
<li><code>--language csharp</code>: Specifies that we want to generate a C# client. The language option is crucial given Kiota's multi-language support.</li>
<li><code>--openapi</code>: The path or URL to the OpenAPI document describing the API. This is the heart of the generation process.</li>
<li><code>--output</code>: The directory where the generated code will be placed. We point this to our SDK project but it could be a directory in an existing application.</li>
<li><code>--class-name</code>: The name of the main client class to generate. The default is <code>ApiClient</code>, but giving it a meaningful name like <code>ChristmasApiClient</code> makes it clear what service this client interacts with and avoids naming conflicts.</li>
<li><code>--namespace-name</code>: The namespace for the generated code. This helps organize the code and should align with your project's namespace conventions.</li>
</ul>
<p>After running the command, Kiota will generate models, API request builders, and our API client class in the specified output directory. You will also see a <code>kiota-lock.json</code> file that captures the generation settings for future reference.</p>
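<p>That lock file is more than a receipt. When the API changes, you can regenerate the client with the same settings rather than repeating every option. A quick sketch using the Kiota CLI's <code>update</code> command (assuming the lock file sits in the output directory):</p>
<pre><code class="language-bash">kiota update --output ./src/Sdk
</code></pre>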
<p>We now have our gift-wrapped SDK ready to be shared!</p>
<h2 id="using-the-generated-sdk"><a aria-hidden="true" tabindex="-1" href="#using-the-generated-sdk"><span class="icon icon-link"></span></a>Using the generated SDK</h2>
<p>With our SDK generated, it's time to see how we can use it in an application. Let's create a simple console application that utilizes the <code>ChristmasApiClient</code> to interact with our ASP.NET Core web API. We'll use a C# file-based application for simplicity:</p>
<pre><code class="language-csharp">#!/usr/bin/env dotnet

#:project ../src/Sdk/Sdk.csproj
#:package Microsoft.Kiota.Bundle@1.21.0

using Microsoft.Kiota.Http.HttpClientLibrary;
using Microsoft.Kiota.Abstractions.Authentication;
using VictorFrye.MerryChristmas.Sdk;

var baseUrl = args.Length > 0 ? args[0] : "https://localhost:7045";

var client = new ChristmasApiClient(new HttpClientRequestAdapter(new AnonymousAuthenticationProvider())
{
    BaseUrl = baseUrl
});

Console.WriteLine(await client.Api.Christmas.GetAsync());

</code></pre>
<p>This demo is very lightweight, but gives a sneak peek at our gifted SDK in action. We create an instance of the <code>ChristmasApiClient</code>, configure it with a base URL, and make a call to the <code>GetAsync</code> method on the <code>Christmas</code> endpoint. The generated client handles all the HTTP communication, serialization, and deserialization for us.</p>
<p>The biggest trick of any generated code is understanding how to use it. The real gift: we don't have to write any of the boilerplate code ourselves. The generated SDK does all the heavy lifting, allowing us to focus on building out and consuming our APIs.</p>
<h2 id="final-thoughts"><a aria-hidden="true" tabindex="-1" href="#final-thoughts"><span class="icon icon-link"></span></a>Final thoughts</h2>
<p>Code generation is a powerful tool that can save developers significant time and effort. Kiota made it easy to generate SDKs for OpenAPI-described APIs, and given its tight integration with the .NET ecosystem, it was a natural choice for my microservices project. This example walks through a C# SDK, but Kiota's multi-language support means you can gift SDKs to your frontend developers in TypeScript or your data science team in Python.</p>
<p>If you need additional reference material, check out the official <a href="https://learn.microsoft.com/en-us/openapi/kiota/">Kiota documentation</a> for more details on languages, options, and advanced usage scenarios. You can also explore my <a href="https://github.com/victorfrye/secretsanta">Secret Santa repository</a> which contains the sample ASP.NET Core web API and a Kiota-generated SDK and samples.</p>
<p>Happy holidays, and this festive season, give the gift of a ready-to-use SDK with Kiota! Ho ho ho!</p>]]></content:encoded>
            <category>dotnet</category>
            <category>openapi</category>
<enclosure url="https://victorfrye.com/assets/_blog/gifting-sdks-with-kiota/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Cozy Aspire Dashboarding]]></title>
            <link>https://victorfrye.com/blog/posts/cozy-aspire-dashboarding</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/cozy-aspire-dashboarding</guid>
            <pubDate>Tue, 28 Oct 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[The Aspire dashboard is your home for local development. Let's explore customizing it with friendly URLs and icons for a cozier experience.]]></description>
            <content:encoded><![CDATA[<p>The Aspire dashboard is your home for local development when building with Aspire. It provides a centralized view of all your modeled services, their health, endpoints, and logs. By default, the dashboard provides a functional experience. But we can make it cozier with some out-of-the-box customizations. In this post, I'll walk through how to customize both the endpoint URLs and the service icons to make your Aspire dashboard feel a bit more like home.</p>
<h2 id="the-default-aspire-dashboard"><a aria-hidden="true" tabindex="-1" href="#the-default-aspire-dashboard"><span class="icon icon-link"></span></a>The default Aspire dashboard</h2>
<p>When you start up your app host with Aspire, you are greeted with the Aspire dashboard. It provides a wealth of information, but front and center is the resource table listing all your modeled services. The number of resources will vary based on your application, but for this post, we will use only one: a static web client. The methods for customizing the dashboard remain the same regardless of the number of services.</p>
<p>By default, the Aspire dashboard uses icons based on resource type and displays URLs based on the endpoints you configured. Given the following simple app host <code>Program.cs</code>:</p>
<pre><code class="language-csharp">var builder = DistributedApplication.CreateBuilder(args);

builder.AddNpmApp("client", "../WebClient", "dev")
       .WithHttpEndpoint(env: "PORT")
       .WithExternalHttpEndpoints();

await builder.Build().RunAsync();

</code></pre>
<p>This will produce a resource table similar to the following:</p>
<p><img src="/assets/_blog/cozy-aspire-dashboarding/default_dashboard.jpg" alt="Default Aspire Dashboard"></p>
<p>Here, we see our table row for the <code>client</code> service. The icon is a generic executable icon since we are using the NPM app resource type. The URL is plain and literal for the HTTP endpoint we configured. This is far from cozy! Let's fix that.</p>
<h2 id="custom-resource-icons-using-withiconname"><a aria-hidden="true" tabindex="-1" href="#custom-resource-icons-using-withiconname"><span class="icon icon-link"></span></a>Custom resource icons using WithIconName</h2>
<p>Under the hood, the Aspire dashboard uses the Fluent UI design system by Microsoft. With the addition of a new method on the resource builders, we can specify any existing Fluent UI icon for our resource: the <code>WithIconName()</code> method. It takes an <code>iconName</code> parameter that corresponds to the name of the Fluent icon to use. For a list of all available icons, check out the <a href="https://storybooks.fluentui.dev/react/?path=/docs/icons-catalog--docs">Fluent UI Icons catalog</a>.</p>
<p><img src="/assets/_blog/cozy-aspire-dashboarding/icon_catalog.jpg" alt="Fluent UI icons catalog with globe search"></p>
<p>In my case, this is my personal website, and I always think of globes when I think of websites. So I search for "globe" in the catalog and find a plethora of system icons related to globes. I am fairly familiar with this catalog, so I already know what I am looking for, but if you are new to Fluent UI, take your time browsing the catalog to find an icon that resonates with you. Icons are commonly named in the format <code>&#x3C;iconName>&#x3C;iconVariant></code>, with icon names being literal metaphors for the shape rather than the functionality, so you may have to try a few different names to find that perfect icon. The <code>iconVariant</code> suffix can be <code>Regular</code>, <code>Filled</code>, or <code>Color</code>. The regular and filled options can be provided as a second enumerated parameter to the <code>WithIconName()</code> method; the color variant is not currently supported, though it may be an option in the future. By default, the filled variant is used if none is specified.</p>
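<p>For instance, if I preferred the outline style, the variant can be passed as the second argument. A minimal sketch, assuming the <code>IconVariant</code> enum from the Aspire hosting APIs:</p>
<pre><code class="language-csharp">builder.AddNpmApp("client", "../WebClient", "dev")
       .WithIconName("Globe", IconVariant.Regular) // outline instead of the default filled
       .WithHttpEndpoint(env: "PORT")
       .WithExternalHttpEndpoints();
</code></pre>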
<p>So, I decide to use the <code>Globe</code> icon and keep the default filled variant. I update my app host code as follows:</p>
<pre><code class="language-csharp">var builder = DistributedApplication.CreateBuilder(args);

builder.AddNpmApp("client", "../WebClient", "dev")
       .WithIconName("Globe")
       .WithHttpEndpoint(env: "PORT")
       .WithExternalHttpEndpoints();

await builder.Build().RunAsync();

</code></pre>
<p>This produces the following updated dashboard:</p>
<p><img src="/assets/_blog/cozy-aspire-dashboarding/with_icon_table.jpg" alt="Aspire Dashboard with custom icon"></p>
<p>Now, our client service has a globe icon! This is already feeling cozier. It also has functional benefits as I can quickly identify my web client service in the dashboard. It's less important with only one resource, but if you have multiple services with the same resource type, you will appreciate the visual distinction.</p>
<p>It also is much more distinguishable when looking at the graph view of resources:</p>
<p><img src="/assets/_blog/cozy-aspire-dashboarding/with_icon_graph.jpg" alt="Aspire Dashboard graph view"></p>
<p>The graph view is especially useful when examining the dependency relationships between all modeled services. Our sample application is simple, but in a more complex application, this view becomes invaluable. The custom icons help make it even clearer.</p>
<h2 id="custom-urls-using-withurlforendpoint"><a aria-hidden="true" tabindex="-1" href="#custom-urls-using-withurlforendpoint"><span class="icon icon-link"></span></a>Custom URLs using WithUrlForEndpoint</h2>
<p>When you add endpoints to your modeled services (or rely on the defaults), Aspire renders the literal URLs in the dashboard. For an ASP.NET Core web API, you might see an <code>http</code> URL and an <code>https</code> URL. For our NPM app, we see just the <code>http</code> URL that our npm command is serving the dev server on. I'm using dynamic ports, so the port number is not predictable, but the URL shows it at a glance. It's also a link, so I can click it to open the client in my browser, which means I never need to care about the actual port number. I just want to open my client. I also have three major pages in my web client: home at <code>/</code>, my resume at <code>/resume</code>, and the blog you may be reading at <code>/blog</code>. It would be really nice if I could quickly open any of these pages with short hyperlinks. Fortunately, Aspire provides a way to do this with the <code>WithUrlForEndpoint()</code> method.</p>
<p>Aspire has two different constructs at play here: URLs and endpoints. An endpoint is named and may be configured by the resource type itself. Each endpoint may have one or more URLs associated with it. By default, Aspire creates URLs based on the actual listening addresses of the endpoints. However, with the <code>WithUrlForEndpoint()</code> method, we can customize or add URLs for any endpoint. This method takes two parameters: the <code>endpointName</code> and a <code>callback</code> to configure the URL. The <code>endpointName</code> is the name of the endpoint to add the URL for; in our case, we only have one HTTP endpoint, so we can use the default name of <code>http</code>. The <code>callback</code> either modifies the existing URL for the endpoint or returns an additional one. Let's add three URLs for the three main pages in the web client. We can update our app host code as follows:</p>
<pre><code class="language-csharp">var builder = DistributedApplication.CreateBuilder(args);

builder.AddNpmApp("client", "../WebClient", "dev")
       .WithIconName("Globe")
       .WithHttpEndpoint(env: "PORT")
       .WithUrlForEndpoint("http", static url => url.DisplayText = "🏠 Home")
       .WithUrlForEndpoint("http", static _ => new()
        {
            Url = "/resume",
            DisplayText = "💼 Resume"
        })
        .WithUrlForEndpoint("http", static _ => new()
        {
            Url = "/blog",
            DisplayText = "✏️ Blog"
        })
        .WithExternalHttpEndpoints();

await builder.Build().RunAsync();

</code></pre>
<p>Our URLs are relative and have two properties: <code>Url</code> and <code>DisplayText</code>. The <code>Url</code> is the URL relative to the endpoint. The <code>DisplayText</code> is optional and determines what is shown in the dashboard, which allows us to use emojis or friendly names for our URLs. After updating our code, we get the following dashboard:</p>
<p><img src="/assets/_blog/cozy-aspire-dashboarding/cozy_dashboard.jpg" alt="Cozy Aspire Dashboard with custom URLs"></p>
<h2 id="a-cozier-dashboard"><a aria-hidden="true" tabindex="-1" href="#a-cozier-dashboard"><span class="icon icon-link"></span></a>A cozier dashboard</h2>
<p>Now, our Aspire dashboard feels a bit more like home. We have a friendly globe icon for our web client service and three quick links to our main pages with emojis to make them stand out. This makes <a href="/blog/posts/local-friendly-aspire-modeling">local development with Aspire</a> even more enjoyable, and it makes my application easier to work with through visual cues and quick links. I recommend adding these customizations to your aspirified applications to help your team develop faster and with more joy. Happy aspirifying!</p>
            <category>aspire</category>
            <category>cloudnative</category>
            <category>dotnet</category>
<enclosure url="https://victorfrye.com/assets/_blog/cozy-aspire-dashboarding/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Six Months Changes .NET STS]]></title>
            <link>https://victorfrye.com/blog/posts/six-months-changes-dotnet-sts</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/six-months-changes-dotnet-sts</guid>
            <pubDate>Thu, 23 Oct 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[Microsoft has extended the short-term support (STS) policy for .NET, providing 24 months of support for each release. Let's dive into what this means for enterprise developers planning their upgrade and support strategy.]]></description>
            <content:encoded><![CDATA[<p>Microsoft has announced an update to the short-term support (STS) policy for .NET releases, <a href="https://devblogs.microsoft.com/dotnet/dotnet-sts-releases-supported-for-24-months/">extending the length of support from 18 months to 24 months</a>. This goes into effect with the already released .NET 9. Not enough people are talking about this, but it significantly changes the conversation around which .NET versions to use in production.</p>
<p>This post will break down what this means for developers and organizations planning their .NET upgrade and support strategy. Let's dive in.</p>
<h2 id="standard-term-long-term"><a aria-hidden="true" tabindex="-1" href="#standard-term-long-term"><span class="icon icon-link"></span></a>Standard term, long term</h2>
<p>.NET has been using a short-term, long-term support model for its releases. The short-term support (STS) releases are the odd-numbered releases, while long-term support (LTS) releases are even-numbered. The table below covers the currently supported and anticipated .NET versions:</p>
<table>
<thead>
<tr>
<th>Version</th>
<th>Support Type</th>
</tr>
</thead>
<tbody>
<tr>
<td>.NET 8</td>
<td>LTS</td>
</tr>
<tr>
<td>.NET 9</td>
<td>STS</td>
</tr>
<tr>
<td>.NET 10</td>
<td>LTS (Nov 2025)</td>
</tr>
<tr>
<td>.NET 11</td>
<td>STS (Nov 2026)</td>
</tr>
</tbody>
</table>
<p>Altogether, this is not a lot of versions. The biennial release cadence of LTS releases, with STS releases in the off years, leaves us with a simple lineup of supported versions. There is additional complexity in out-of-band components, but the majority of conversation around .NET support focuses on the major versions released each November.</p>
<p>When it comes to supporting languages, frameworks, and runtimes, the conversation always centers on what upgrade cadence is required to stay in support and stay stable. With .NET, the major versions have a clear release window, have predictable support through Microsoft, and (despite some legacy concerns) have proven to be stable for production enterprise usage at launch regardless of support duration. Each release includes shiny new features and performance improvements that make upgrading to the latest version desirable, but upgrading takes time and effort. This means upgrading is the primary tax when choosing whether to use .NET STS releases, and it is also where STS has consistently failed. But that is changing with the new 24-month support window for STS releases.</p>
<h2 id="upgrade-tax"><a aria-hidden="true" tabindex="-1" href="#upgrade-tax"><span class="icon icon-link"></span></a>Upgrade tax</h2>
<p>Developers love to use the latest and greatest technology, including new versions. However, enterprises are limited by their ability to spend the time needed to upgrade. When applications transition from growth or active work to maintenance mode, the ability to upgrade decreases significantly. In today's technology landscape, this transition from active to legacy can happen within months or even weeks. We still need to upgrade to maintain our security posture, apply bug fixes, and keep up with compliance requirements. Upgrading, however, is a tax that enterprises pay in time, resources, and potential disruption to the business. The longer the support window, the more time enterprises have to plan and execute upgrades, reducing the overall tax.</p>
<p>In many languages, the upgrade tax is only worth it for LTS versions because the STS support window was too short to justify the risk of being unable to upgrade in a timely fashion. .NET STS releases suffered from this problem with their 18-month support windows. To understand this better, let's review what the choice between the current STS and LTS releases would look like:</p>
<h3 id="18-month-sts-scenario"><a aria-hidden="true" tabindex="-1" href="#18-month-sts-scenario"><span class="icon icon-link"></span></a>18-month STS scenario</h3>
<p>A year ago, in November 2024, .NET 9 was about to be released and a decision had to be made: do we stay on the .NET 8 LTS version or upgrade to the .NET 9 STS version? We have the time and momentum to upgrade to .NET 9, but the upgrade tax needs to include consideration for the following upgrade. Staying on .NET 8, support would end in November 2026, at which time .NET 10 would have a year of support left and .NET 11 would be freshly launched. However, if we upgraded to .NET 9, we would have to upgrade again within 18 months, by May 2026, when the actively supported versions would be .NET 8 and .NET 10. STS releases like .NET 9 expired before support for the preceding LTS version ended.</p>
<p>Enterprise upgrades for apps in support mode are commonly done at the last minute as deadline pressures mount. Those pressures and time constraints then weigh on teams working on growing or mature applications like the one in this scenario. Staying on .NET 8, you can wait until the last minute and either skip ahead to the newly released .NET 11 or stay on LTS and upgrade only to .NET 10. The work to upgrade might be more, but it may be worth it. Even if your application moves into support mode or feature work is too demanding, the tax of upgrading is manageable. If you upgrade to .NET 9, when the deadline arrives, you have to move to .NET 10; you cannot skip a version without risking being out of support. This makes the choice of moving off an LTS version onto an STS version a commitment to keep the app on the upgrade treadmill, paying the upgrade tax every year.</p>
<h2 id="support-change"><a aria-hidden="true" tabindex="-1" href="#support-change"><span class="icon icon-link"></span></a>Support change</h2>
<p>The .NET team has recognized issues with the 18-month support window for STS releases and extended support going forward to 24 months. This takes effect with .NET 9, moving the end of support from May 2026 to November 2026. This means that support for STS releases will now end at the same time as the previous LTS release, aligning the end-of-support dates for both STS and LTS versions.</p>
<p><img src="/assets/_blog/six-months-changes-dotnet-sts/release_lifecycle.jpg" alt="New .NET support lifecycle for releases (Source: Microsoft)">
<em>Image: Microsoft</em></p>
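<p>The new alignment is easy to sanity-check with simple date arithmetic. A quick illustration, not an official calculator, using 36 months of support for LTS and the new 24 months for STS:</p>
<pre><code class="language-csharp">var net8End = new DateOnly(2023, 11, 1).AddMonths(36); // .NET 8 LTS: November 2026
var net9End = new DateOnly(2024, 11, 1).AddMonths(24); // .NET 9 STS, new policy: November 2026
var net9Old = new DateOnly(2024, 11, 1).AddMonths(18); // .NET 9 STS, old policy: May 2026
</code></pre>
<p>Both .NET 8 and .NET 9 now exit support together in November 2026, where the old policy would have dropped .NET 9 six months earlier.</p>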
<p>This change in support solves a lot of problems, including with our scenario above:</p>
<ul>
<li>LTS releases no longer outlive their following STS releases</li>
<li>STS releases now end support at the same time as a new release</li>
</ul>
<p>Given these changes, let's examine the impact on our upgrade tax scenario.</p>
<h3 id="24-month-sts-scenario"><a aria-hidden="true" tabindex="-1" href="#24-month-sts-scenario"><span class="icon icon-link"></span></a>24-month STS scenario</h3>
<p>We are going to rewrite history and assume we knew about this 24-month support. We have the choice to upgrade from an LTS version to STS, from .NET 8 to .NET 9. Now, the future considerations change. A year later, we may have the bandwidth to stay on the upgrade treadmill and move to .NET 10. However, enterprise priorities shift and we choose not to. Both .NET 8 and .NET 9 apps are still supported until the following release. If we wait until the last minute, both end their support with the same options: upgrade to the LTS version (.NET 10) or skip ahead to .NET 11.</p>
<p>The change in the scenario is subtle, but significant. Choosing to move from an LTS version to an STS version no longer commits you to a different upgrade schedule than apps that stayed on LTS. Apps in support mode or transitioning to it can stay on LTS, while actively developed applications can take advantage of STS releases. Moving to an STS release is not a commitment to a faster upgrade treadmill, but rather an opportunity to take advantage of new features and improvements while still being able to upgrade to the next release when ready.</p>
<h2 id="net-10-is-coming"><a aria-hidden="true" tabindex="-1" href="#net-10-is-coming"><span class="icon icon-link"></span></a>.NET 10 is coming</h2>
<p>We are very close to the launch of .NET 10, the next LTS release. The change in STS support won't impact this launch, yet it is the start of a new conversation around upgrading .NET applications. .NET 8 and .NET 9 applications will be upgrading to .NET 10 when ready, taking advantage of new features and performance improvements. What the STS support lifecycle really changes is what comes next: will you be stuck on .NET 10 for the next two years, or will you be confident in your choice in November 2026 to stay on .NET 10 or make the jump to .NET 11?</p>
<h2 id="reflections-and-recommendations"><a aria-hidden="true" tabindex="-1" href="#reflections-and-recommendations"><span class="icon icon-link"></span></a>Reflections and recommendations</h2>
<p>The scenario I covered was a real one. The team had the bandwidth, the management support was there, and the product roadmap was healthy. However, no one was confident about where we would be in 18 months or whether we could keep paying the upgrade tax. The schedule was too much of a commitment. The move to 24-month support changes the calculus. The hesitation to upgrade to .NET 9 would not have existed if we had known it wouldn't trap us into upgrading to .NET 10 a year later. Had the 24-month support existed at the time, we would have upgraded.</p>
<p>Going forward, I will be recommending my projects use the latest version of .NET, regardless of STS or LTS status. The alignment of LTS and STS support expiration dates means projects under active development are not penalized for choosing STS releases. The upgrade tax is most manageable when projects are active and making small updates frequently. LTS releases are still best suited for applications in maintenance mode or with infrequent updates. However, the choice to use STS releases will no longer be a heavy decision weighted down by upgrade tax concerns. This is a win for .NET developers and enterprises looking to stay current with the platform.</p>
            <category>dotnet</category>
<enclosure url="https://victorfrye.com/assets/_blog/six-months-changes-dotnet-sts/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Momentum 2025]]></title>
            <link>https://victorfrye.com/blog/posts/momentum-2025</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/momentum-2025</guid>
            <pubDate>Mon, 20 Oct 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[Thank you to the Cincy developer community! Thank you to the attendees, organizers, sponsors, and my fellow speakers for making this event a success.]]></description>
            <content:encoded><![CDATA[<p>Thank you to the greater Cincinnati developer community for hosting another Momentum! I was thrilled to return as a consecutive speaker for my second year and capstone my first year speaking at local conferences with the one that started it all for me. I saw so many familiar faces and met many new friends as well. The Cincinnati community continues to impress me and I look forward to potentially visiting again in the future. Cheers!</p>
<p>Two exciting callouts from the event: we had some amazing representation from my employer, <a href="https://leadingedje.com">Leading EDJE</a>, with several team members speaking and attending including <a href="https://www.linkedin.com/in/rfornal/">Bob Fornal</a> and <a href="https://www.linkedin.com/in/ed-legault-37aba4">Ed LeGault</a> who spoke on automated testing and DORA/EBM respectively. Additionally, my local Grand Rapids community showed up with fellow speakers and friends <a href="https://www.linkedin.com/in/brianmckeiver">Brian McKeiver</a> and <a href="https://www.linkedin.com/in/jtower">Jonathan "J." Tower</a> who presented on developer estimates and over-engineering. It was great to see this cross-pollination of communities in Cincinnati!</p>
<p>This blog post is a reference for those who attended my presentation as well as a thank you letter to everyone involved.</p>
<h2 id="cloud-terraformation"><a aria-hidden="true" tabindex="-1" href="#cloud-terraformation"><span class="icon icon-link"></span></a>Cloud Terraformation</h2>
<p><img src="/assets/_blog/momentum-2025/cloud_terraformation.jpg" alt="Cloud Terraformation hero"></p>
<p>My presentation was titled "Cloud Terraformation" and focused on infrastructure as code using Terraform to manage Azure resources. The session covered the ClickOps problem, an overview of Terraform as a choice for infrastructure as code, and a live demo of setting up Terraform with Azure, exporting existing resources, and managing them through code. Unfortunately, we had some technical difficulties with the live demo so no one got to see the final result, but it did work! Thank you for bearing with me through that AV hiccup. For those interested in learning more or checking out the final state, some helpful links and resources include:</p>
<ul>
<li><a href="https://github.com/victorfrye/presentations/blob/main/files/2025/momentum/cloudterraformation.pdf">Presentation slide deck</a></li>
<li><a href="https://github.com/victorfrye/shrugman">Shrug Man repository</a></li>
<li><a href="https://shrugman.com">Shrug Man website</a></li>
<li><a href="https://developer.hashicorp.com/terraform">Terraform documentation</a></li>
<li><a href="https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs">AzureRM provider documentation</a></li>
</ul>
<h2 id="thank-you"><a aria-hidden="true" tabindex="-1" href="#thank-you"><span class="icon icon-link"></span></a>Thank you</h2>
<p>Thank you again if you attended my presentation or any of the other great sessions at Momentum 2025. Attendees embody the passion behind events like Momentum, and your showing up makes these events worthwhile for speakers, organizers, and sponsors alike. Thank you to the organizers, especially <a href="https://www.linkedin.com/in/accidentaldeveloper">Michael Richardson</a> and <a href="https://www.linkedin.com/in/matteland">Matt Eland</a>, for their tireless work making this event happen and putting together an excellent lineup of speakers. Organizers like them bring the dream and vision of these events to life. Finally, thank you to the sponsors who continue to invest in developer communities and make conferences like Momentum possible. These are the companies that helped make Momentum 2025 a reality and are investing in the future of the Cincinnati developer community:</p>
<ul>
<li><a href="https://thecircuit.net/">The Circuit</a></li>
<li><a href="https://53.com">Fifth Third</a></li>
<li><a href="https://eliassen.com/">Eliassen Group</a></li>
<li><a href="https://westernsouthern.com/">Western &#x26; Southern</a></li>
<li><a href="https://afidence.com/">Afidence</a></li>
<li><a href="https://ingagepartners.com/">Ingage</a></li>
<li><a href="https://helloencore.com/">Encore Talent</a></li>
<li><a href="https://maxtrain.com/">MAX Technical Training</a></li>
<li><a href="https://cgi.com/">CGI</a></li>
<li><a href="https://agilitypartners.io/">Agility Partners</a></li>
<li><a href="https://cincinnatizoo.org/">Cincinnati Zoo</a></li>
<li><a href="https://americansignmuseum.org/">American Sign Museum</a></li>
<li><a href="https://breadkrumb.com/">Breadkrumb</a></li>
<li><a href="https://woodburngames.com/">Woodburn Games</a></li>
<li><a href="https://tql.com/">TQL</a></li>
</ul>]]></content:encoded>
            <category>terraform</category>
            <category>azure</category>
            <category>events</category>
<enclosure url="https://victorfrye.com/assets/_blog/momentum-2025/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[From Dev to DevOps]]></title>
            <link>https://victorfrye.com/blog/posts/from-dev-to-devops</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/from-dev-to-devops</guid>
            <pubDate>Mon, 29 Sep 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[Learning DevOps is not limited by technology skills. Here, we discuss the tooling and mindset that can help developers transition to DevOps engineers.]]></description>
            <content:encoded><![CDATA[<p>DevOps is more than a role; it's a culture and mindset that bridges the gap between development and operations. Any member of an IT organization or software company can embrace DevOps principles to improve collaboration, streamline processes, and enhance software delivery. Any person can carry more than one role. However, the literature for DevOps often starts with operations: system administrators, infrastructure engineers, and site reliability engineers (SREs). One of the best books on the topic, <a href="https://itrevolution.com/product/the-phoenix-project/">The Phoenix Project</a>, is written from the perspective of an operations manager (and I highly recommend reading it). DevOps is about operations, but it is also about development. In truth, DevOps is about the entire software lifecycle and thus any person involved in it can learn and grow into a DevOps role. One such path is from developer to a DevOps engineer.</p>
<h2 id="the-common-guidance"><a aria-hidden="true" tabindex="-1" href="#the-common-guidance"><span class="icon icon-link"></span></a>The common guidance</h2>
<p>The most common guidance for learning DevOps is to start with tooling from the operations perspective, with recommendations to begin with Linux, containers, or Kubernetes. Some may find success this way, but I find it misleading. DevOps is difficult enough to learn on its own, and these technologies are complex in their own right. Whether your code executes in a container, a virtual machine, or bare-metal on a Windows server also does not matter for practicing DevOps. However, a well-informed DevOps engineer knows why containerization is used and why the choice of operating system matters. Instead, I recommend starting with what you know and building on that. If you want to learn DevOps, start with the various roles that practice it: developers, testers, operations engineers, or project managers. Here, I am focused on the developer role because that is my background and what I know best.</p>
<h2 id="the-developer-role"><a aria-hidden="true" tabindex="-1" href="#the-developer-role"><span class="icon icon-link"></span></a>The developer role</h2>
<p>A developer is responsible for writing, testing, and maintaining code that forms the basis of software applications. They work with team members in various roles to:</p>
<ul>
<li>Understand requirements of what to build and translate them into functional software.</li>
<li>Write clean, efficient, and quality code that is testable and maintainable.</li>
<li>Ensure the software is buildable, deployable, and operational in installed environments.</li>
<li>Deliver software that meets user needs and business goals in a timely manner.</li>
</ul>
<p>One distinction is that a person may perform more than one role. For example, a developer may also be acting in the role of a manager, a designer, or a network engineer. The role of a developer is focused on developing software, but a person is often responsible for more than just writing code. Commonly, people in the developer role are also responsible for:</p>
<ul>
<li>Troubleshooting business applications and triaging why behavior is not as expected.</li>
<li>Understanding legacy software and how it operates critical business logic.</li>
</ul>
<p>In an enterprise and during a production incident, someone in the role of developer may be called to explain why insurance claims are still pending or an appointment booking failed. At the intersection of operations and development, a developer may be the first to know when a database failure or network outage is causing business disruption. In this way, developers are already acting outside of the limited scope of writing code. This is where DevOps comes in.</p>
<h2 id="the-devops-shift"><a aria-hidden="true" tabindex="-1" href="#the-devops-shift"><span class="icon icon-link"></span></a>The DevOps shift</h2>
<p>DevOps is about the entire software lifecycle and the interrelationships between traditional developer and operations roles. A person in the role of both developer and DevOps engineer is responsible for:</p>
<ul>
<li>Understanding the entire software lifecycle, from planning feature requirements and writing code to deploying and maintaining applications in production.</li>
<li>Developing the solutions that support the software lifecycle, such as CI/CD pipelines, infrastructure in the form of code, and automating tests.</li>
<li>Knowing the difference between code written and value delivered.</li>
</ul>
<p>Being a DevOps engineer may never include a direct title change. However, it may represent a growth in responsibilities commonly required for promotion. A developer who understands how to implement DevOps practices in tooling is one who can understand architecture, processes, and the business value of their application and how to drive change with teams. These are requirements that lead to senior and principal engineer roles.</p>
<h2 id="learning-devops"><a aria-hidden="true" tabindex="-1" href="#learning-devops"><span class="icon icon-link"></span></a>Learning DevOps</h2>
<p><img src="/assets/_blog/from-dev-to-devops/learning_devops.jpg" alt="The DevOps learning path for developers"></p>
<h3 id="build-systems"><a aria-hidden="true" tabindex="-1" href="#build-systems"><span class="icon icon-link"></span></a>Build systems</h3>
<p>As a developer, you are already working with various technologies used for DevOps. The first is your build system. Today, software is built often. You need to build your application locally multiple times to test changes. You may use pipelines to build your application in another environment to verify changes in a pull request. If you want to move from developer to DevOps engineer, the first place to start is understanding how your code is built and how it is run in all of its different environments.</p>
<p>With .NET, this means understanding the differences between the .NET SDK and runtime and the <a href="https://learn.microsoft.com/en-us/dotnet/core/tools/">dotnet CLI</a> used to build, run, and publish code. For JavaScript, this means understanding the differences between development servers, bundling, and how static files are served in browsers. Every language has its own build tools and its own execution environments. For .NET, the common language runtime (CLR) is used to run code on Windows, Linux, and macOS. For JavaScript, the runtime is the browser or Node.js. Understanding how your code is built and executed is critical to automation and maintenance. When you know this, you can begin to optimize and automate the process.</p>
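<p>To make this concrete, here is a sketch of the .NET build lifecycle from the command line. The project file in the current directory is hypothetical, and the script is guarded so it does nothing where the .NET SDK is not installed:</p>

```shell
# Sketch of the .NET build lifecycle; guarded so it is a no-op without the
# SDK or a project file. The project in the current directory is hypothetical.
if command -v dotnet >/dev/null 2>&1 && ls ./*.csproj >/dev/null 2>&1; then
  dotnet restore                                              # resolve NuGet dependencies
  dotnet build --configuration Release                        # compile with the SDK
  dotnet publish --configuration Release --output ./publish   # produce deployable output
  BUILD_RESULT=published
else
  BUILD_RESULT=skipped
fi
echo "$BUILD_RESULT"
```

<p>These are the same steps an IDE or pipeline performs on your behalf; knowing them is what lets you automate them.</p>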
<h3 id="source-control-concepts"><a aria-hidden="true" tabindex="-1" href="#source-control-concepts"><span class="icon icon-link"></span></a>Source control concepts</h3>
<p>Most developers are already using source control, such as Git, to store and collaborate on code. However, it is an underappreciated tool that is critical to developers and DevOps engineers alike. Source control systems are the foundation of collaboration and change management. GitOps is a practice that uses Git repositories as the source of truth for all kinds of code, including application code, infrastructure as code, configuration files, and CI/CD pipelines. Your branching strategies and pull request processes are key aspects of how you audit and manage change. Git is the tool, but GitOps is the adoption of DevOps practices for automation of operational concerns. Turns out this developer tool is also a DevOps tool.</p>
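<p>A tiny illustration of the GitOps idea, using a throwaway repository: an infrastructure definition is committed like any other code, giving you a reviewable, auditable history. The repository, file, and commit message are all illustrative:</p>

```shell
# GitOps in miniature: infrastructure definitions versioned like application
# code. Guarded so it is a no-op where git is not installed.
if command -v git >/dev/null 2>&1; then
  repo=$(mktemp -d)
  cd "$repo"
  git init --quiet
  git config user.email "demo@example.com"
  git config user.name "Demo"
  mkdir -p infra
  printf '# (illustrative) a VM defined as code\n' > infra/main.tf
  git add infra/main.tf
  git commit --quiet --message "Add VM definition as code"
  GITOPS_LOG=$(git log --oneline)   # the audit trail of infrastructure change
else
  GITOPS_LOG="git not installed"
fi
echo "$GITOPS_LOG"
```

<p>Branching strategies and pull requests then layer review and change management on top of exactly this history.</p>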
<h3 id="command-line-and-scripting"><a aria-hidden="true" tabindex="-1" href="#command-line-and-scripting"><span class="icon icon-link"></span></a>Command-line and scripting</h3>
<p>Most developers can avoid the command line these days. IDEs and graphical interfaces often abstract away the need for a command-line interface (CLI). However, CLIs are necessary for DevOps automation. You may know that F5 runs your code in the IDE, but when authoring a pipeline you need to know the commands that do the same. Sometimes one command becomes a series of commands, at which point you transition from simple commands to scripting. Commonly, the recommended scripting language is Bash, as it is the native shell on Linux. However, any scripting language will help you as you learn DevOps; you can learn PowerShell or Python and still accomplish much of what you need to do. The key is to learn how to automate the tasks you would otherwise do manually with your mouse. Bash, PowerShell, and Python are all cross-platform choices. Practice navigating your file system, managing installed apps, and running your build commands from the command line.</p>
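<p>As a hedged example of where that transition lands, here is a minimal task script: the point at which a series of commands becomes repeatable automation. The task names and the <code>./out</code> layout are made up for illustration:</p>

```shell
# A minimal task runner in portable shell; 'clean' and 'build' are
# illustrative tasks, not real build tooling.
set -eu

clean() { rm -rf ./out; }
build() {
  mkdir -p ./out
  printf 'built\n' > ./out/artifact.txt   # stand-in for a compile step
}

task="${1:-build}"   # default task when none is given
case "$task" in
  clean) clean ;;
  build) clean; build; echo "build complete" ;;
  *)     echo "unknown task: $task" >&2; exit 1 ;;
esac
```

<p>The same script runs on your machine and in a pipeline, which is exactly why scripting matters for DevOps.</p>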
<h3 id="continuous-integration-and-delivery"><a aria-hidden="true" tabindex="-1" href="#continuous-integration-and-delivery"><span class="icon icon-link"></span></a>Continuous integration and delivery</h3>
<p>The best-known acronym in DevOps is CI/CD, which stands for continuous integration and continuous delivery (or deployment). As a developer, you may already be using CI/CD pipelines to build and test your code. It may be tied to your source control platform, such as GitHub Actions, Azure DevOps Pipelines, or GitLab CI/CD, or it may be a standalone system like Jenkins. This is likely the first tooling primarily associated with DevOps that you will start authoring as you learn the role of DevOps engineer. However, a pipeline in and of itself is not CI/CD. You can write a pipeline that copies source code to a server, but that does not give you continuous integration, delivery, or deployment. Continuous integration is improved through pipelines that compile code consistently, run tests to verify changes, or enforce quality through additional checks like linters and static code analyzers. Continuous delivery is reached when your pipelines produce deployment-ready artifacts that are reusable and ready to deploy to any environment. Continuous deployment is achieved when your pipelines automatically deploy code to your environments without human intervention. A pipeline is a tool, but CI/CD is a practice and outcome. Learn pipeline tooling, but learn it with the goal of automating the steps needed for CI/CD.</p>
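<p>Stripped of pipeline YAML, continuous integration reduces to running each verification step and stopping at the first failure. In this sketch the step commands are stand-ins (<code>true</code>) for a real compiler, test runner, and linter:</p>

```shell
# Fail-fast verification chain; 'true' stands in for real commands such as
# 'dotnet build' or 'dotnet test'.
run_step() {
  echo "step: $1"
  shift
  "$@"
}

CI_STATUS=failed
if run_step compile true &&
   run_step test true &&
   run_step lint true
then
  CI_STATUS=passed   # only now is an artifact worth delivering
fi
echo "CI $CI_STATUS"
```

<p>Every pipeline platform expresses this same chain; the practice is in what the steps verify, not the YAML that wires them together.</p>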
<h3 id="hosting-and-runtime-environments"><a aria-hidden="true" tabindex="-1" href="#hosting-and-runtime-environments"><span class="icon icon-link"></span></a>Hosting and runtime environments</h3>
<p>As you learn pipelines and the concepts of CI/CD, you will also need to understand where your code is run. This can vary widely depending on your organization or application. You may be running on bare-metal servers, virtual machines, containers, or serverless environments. You may be running on-premises or in the cloud. You may be using a platform-as-a-service (PaaS) or infrastructure-as-a-service (IaaS). The key is to understand where your code is run, the benefits and trade-offs of each environment, and how to get your code there. Learning Kubernetes in-depth may help if your organization is using it, but it is overkill for a static website or hobby project. It also doesn't help if your organization isn't using containers. Instead, focus on learning the environment your code is run in already. What operating system is used? What cloud provider? Are there differences between the platforms used in development and production?</p>
<h3 id="infrastructure-as-code"><a aria-hidden="true" tabindex="-1" href="#infrastructure-as-code"><span class="icon icon-link"></span></a>Infrastructure as code</h3>
<p>As you learn where your code is run, you will also need to learn how that environment is created and configured. This is where infrastructure as code (IaC) comes in and the developer skills you already possess can shine. IaC is the practice of defining your hosting and runtime environments through code. Various languages and tools exist for this, such as Terraform, Azure Bicep, Ansible, Pulumi, and PowerShell DSC. The value in IaC is the same as traditional source code: it is versioned, readable, and traceable. If you write something to create a virtual machine and never commit it to a central repository, it is lost. However, if you write a Terraform file to create a VM and commit it to source control, you can track changes, review history, and implement CI/CD practices to validate changes and achieve infrastructure automation. As a developer, you already know how to write code. You can learn IaC and apply your existing skills to an operations domain.</p>
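<p>The day-to-day IaC loop mirrors a normal code change: edit, preview the diff, apply, commit. Here is a sketch using the Terraform CLI, guarded so it does nothing where Terraform or the illustrative <code>./infra</code> directory is absent:</p>

```shell
# The IaC feedback loop with Terraform; 'infra/' is an illustrative
# directory of .tf files, and the block is a no-op without it.
if command -v terraform >/dev/null 2>&1 && [ -d ./infra ]; then
  terraform -chdir=infra init                   # download providers and modules
  terraform -chdir=infra plan                   # preview the change, like a diff
  terraform -chdir=infra apply -auto-approve    # make the environment match the code
  IAC_RESULT=applied
else
  IAC_RESULT=skipped
fi
echo "$IAC_RESULT"
```

<p>Committing those <code>.tf</code> files alongside your application code is what turns this from a one-off script into versioned, reviewable infrastructure.</p>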
<h2 id="continuous-learning"><a aria-hidden="true" tabindex="-1" href="#continuous-learning"><span class="icon icon-link"></span></a>Continuous learning</h2>
<p>The journey from developer to DevOps engineer is a surprisingly natural evolution. Developers already know their application and the value it delivers. They already know how to write code and collaborate with others. They already know the software lifecycle and the pains of delivering software. Learning DevOps is about expanding their existing knowledge and skills to automate and optimize the concerns outside of developing new features. The best way to learn DevOps is not necessarily learning Linux or Kubernetes, but instead mastering the tools they are already using and expanding knowledge of the whole system. Learn how your code is built, where it is run, and how it gets there. Automate the friction in the process. When you start there, the mindset of DevOps fits into place:</p>
<p><img src="/assets/_blog/from-dev-to-devops/continuous_learning.jpg" alt="Continuous learning and applications of DevOps knowledge for developers"></p>
<ul>
<li>When you understand your build system, you can optimize your code runtime and <a href="/blog/posts/multi-stage-docker-dotnet-guide">apply containerization efficiently</a>.</li>
<li>When you know source control concepts, you can apply them to infrastructure and pipelines for version control, collaboration, and traceability.</li>
<li>When you possess command-line knowledge, you can automate tasks for test quality and CI/CD pipelines.</li>
<li>When you control your pipelines, you can automate for faster feedback and software delivery.</li>
<li>When you understand your hosting environment, you can optimize for scalability and apply effective deployment strategies.</li>
<li>When you write high quality code, you can apply the same principles to infrastructure, pipeline, and test code.</li>
</ul>
<p>DevOps is not a set of tools or a team, but a fuzzier concept: a mindset and shared responsibility. The path to learning DevOps is likewise non-exact. The concepts and tooling mentioned are how I started to learn DevOps as a developer. Your path may be different, but the key is to start with what you know and use today. From there, you learn the adjacent concepts, the tooling, and the why behind it all. And then, you keep learning.</p>]]></content:encoded>
            <category>devops</category>
            <enclosure url="https://victorfrye.com/assets/_blog/from-dev-to-devops/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Beer City Code 2025]]></title>
            <link>https://victorfrye.com/blog/posts/beer-city-code-2025</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/beer-city-code-2025</guid>
            <pubDate>Mon, 11 Aug 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[Cheers to the Grand Rapids developer community! Thank you to the attendees, organizers, sponsors, and my fellow speakers for making this event a success.]]></description>
            <content:encoded><![CDATA[<p>Thank you to the greater Grand Rapids developer community for an exceptional Beer City Code 2025! This year, I was honored to speak at my local conference for the first time, and it was such a joy to attend and connect with so much of the talent and passion in our community. I look forward to seeing how we can continue to grow and support each other in the future. Cheers!</p>
<p>This blog post is a reference for those who attended my presentations as well as a thank you letter to everyone involved.</p>
<h2 id="aspire-to-effortless"><a aria-hidden="true" tabindex="-1" href="#aspire-to-effortless"><span class="icon icon-link"></span></a>Aspire to effortless</h2>
<p>My first presentation was titled "Aspire to Effortless" and focused on Aspire, the coolest topic in software development. I shared its invitation to a modern .NET platform, what it is, and how the developer dashboard and orchestrator provide an effortless local development experience. For those interested in learning more or finding the resources available, I recommend reading more on my blog and following along as I continue to explore Aspire. Some helpful links and resources include:</p>
<ul>
<li><a href="https://github.com/victorfrye/presentations/blob/main/files/2025/beercitycode/aspiretoeffortless.pdf">Presentation slide deck</a></li>
<li><a href="https://github.com/victorfrye/crudcounter">Sample CRUD app repository</a></li>
<li><a href="/blog/posts/aspire-roadmap-2025">Aspire roadmap blog post</a></li>
<li><a href="https://aka.ms/aspire-discord">Aspire Discord server</a></li>
</ul>
<h2 id="ai-for-dummies"><a aria-hidden="true" tabindex="-1" href="#ai-for-dummies"><span class="icon icon-link"></span></a>AI for dummies</h2>
<p>My second presentation was titled "AI for Dummies" and focused on an overview of artificial intelligence, the value of AI as a service, and how to get started with Azure AI services for building applications without a data science degree. I shared my own experiences building the Mocking Mirror application that uses Azure OpenAI in a .NET application to provide a simple, yet powerful AI experience. I recommend reading more here on my blog or the Leading EDJE blog for staying up to date on the latest in AI. Some links worth checking out include:</p>
<ul>
<li><a href="https://github.com/victorfrye/presentations/blob/main/files/2025/beercitycode/aifordummies.pdf">Presentation slide deck</a></li>
<li><a href="https://github.com/victorfrye/mockingmirror">Mocking Mirror repository</a></li>
<li><a href="https://blog.leadingedje.com">Leading EDJE blog</a></li>
</ul>
<h2 id="thank-you"><a aria-hidden="true" tabindex="-1" href="#thank-you"><span class="icon icon-link"></span></a>Thank you</h2>
<p>Thank you again if you attended my presentations or any of the other amazing sessions at Beer City Code 2025. I hope to see you again next year at the 2026 event! Also thank you to the organizers and volunteers who made this event possible. And a special thank you to the sponsors. These are the companies helping to make this event happen and investing in the future of the Grand Rapids developer community:</p>
<ul>
<li><a href="https://bizstream.com">BizStream</a></li>
<li><a href="https://meijer.com">Meijer</a></li>
<li><a href="https://instructlab.ai">InstructLab</a></li>
<li><a href="https://arrayofengineers.com">Array of Engineers</a></li>
<li><a href="https://progress.com">Progress</a></li>
<li><a href="https://kentico.com">Kentico</a></li>
<li><a href="https://umbraco.com">Umbraco</a></li>
</ul>]]></content:encoded>
            <category>dotnet</category>
            <category>aspire</category>
            <category>ai</category>
            <category>azure</category>
            <category>events</category>
            <enclosure url="https://victorfrye.com/assets/_blog/beer-city-code-2025/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Aspire Roadmap 2025]]></title>
            <link>https://victorfrye.com/blog/posts/aspire-roadmap-2025</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/aspire-roadmap-2025</guid>
            <pubDate>Wed, 30 Jul 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[A look at the recently published Aspire roadmap for 2025, focusing on its code-first DevOps evolution, polyglot aspirations, and AI orchestration, and more.]]></description>
            <content:encoded><![CDATA[<p>The Aspire team has recently published their <a href="https://github.com/dotnet/aspire/discussions/10644">2025 roadmap</a>, revealing an exciting evolution from local development orchestration to a comprehensive framework for DevOps concerns. <a href="/blog/posts/hello-aspire-breaking-down-key-features">Aspire</a> launched with a code-first application model and instantaneous run experience, then expanded into deploy scenarios with publishers. This roadmap shows how it's becoming a complete code-first alternative to YAML-heavy DevOps toolchains while embracing polyglot development and AI workload orchestration.</p>
<p>While these are aspirational goals rather than firm commitments, they provide valuable insight into Aspire's direction. Let's explore the most compelling features and why they position Aspire as a game-changing DevOps framework for .NET, polyglot, and AI applications.</p>
<h2 id="code-first-devops"><a aria-hidden="true" tabindex="-1" href="#code-first-devops"><span class="icon icon-link"></span></a>Code-first DevOps</h2>
<p>DevOps combines development (Dev) and operations (Ops) to deliver software faster and with higher quality. While DevOps is fundamentally about people and processes, the technology and tooling often involve tedious YAML configuration files for CI/CD pipelines and infrastructure management. Aspire is changing this by providing a code-first approach to local development, testing, and deployment, replacing configuration complexity with familiar programming languages.</p>
<h3 id="local-development"><a aria-hidden="true" tabindex="-1" href="#local-development"><span class="icon icon-link"></span></a>Local development</h3>
<p>Aspire already excels at code-first application modeling in C#, expressing your entire architecture—databases, services, .NET projects, and polyglot components—then spinning it up locally with <code>aspire run</code>. No YAML configuration files, just standard .NET code that ideally mirrors your production architecture. The roadmap expands this with:</p>
<ul>
<li><strong>Improved container support</strong>: Shell commands, scripts, and interactive debugging inside containers</li>
<li><strong>Multi-repo support</strong>: Native orchestration across multiple repositories</li>
<li><strong>Built-in runtime acquisition</strong>: Automatic installation of Node.js, .NET, and other required runtimes</li>
</ul>
<p>Aspire local development is a mature feature set already. These improvements focus on further simplifying the developer experience and tackling complex orchestration scenarios. Multi-repo support has been a long-standing pain point, as many developers opt to separate components, like a frontend and backend, into separate repositories. Removing the monorepo requirement or custom cross-repo orchestration makes Aspire more accessible to many teams. You can already run polyglot applications in containers with Aspire, but continued improvements will allow more robust debugging and feedback loops with local containerized applications. The built-in runtime acquisition is both the most exciting and most daunting feature here. It may simplify the first run experience, which helps with onboarding and CI/CD pipelines and is an area of Aspire that I adore. However, depending on its implementation, it could also lead to extra local machine complexity with Aspire-managed runtimes versus system-wide runtimes. The local development experience is already fantastic and delivers the code-first developer experience Aspire promises. Therefore, I am optimistic that these improvements will build on that foundation.</p>
<h3 id="testing"><a aria-hidden="true" tabindex="-1" href="#testing"><span class="icon icon-link"></span></a>Testing</h3>
<p>Aspire's code-first model and instant run experience create ideal conditions for integration and end-to-end testing. You can spin up your entire application stack locally, creating an instant integration test environment with minimal friction. The <code>Aspire.Hosting.Testing</code> package provides this test host for xUnit and other testing frameworks and allows you to benefit from Aspire features like intelligent resource state notifications that eliminate arbitrary sleep times in tests. The roadmap adds advanced testing capabilities:</p>
<ul>
<li><strong>Partial app host execution</strong>: Run only specific components in tests to reduce overhead</li>
<li><strong>Request redirection and mocking</strong>: Control traffic between components for chaos engineering</li>
<li><strong>Code coverage support</strong>: Coverage collection and reporting for integration tests</li>
</ul>
<p>Where local development is the mature foundation of Aspire, testing is currently a secondary benefit that often surprises users by revealing the true value of the framework. These improvements take the Aspire testing story to the next level. Aspire goes from being the startup tooling that manages your integration testing components to a chaos engineering and middleware validation powerhouse. The partial app host execution isn't limited to testing and reduces overhead in local development scenarios where certain components are not needed. In testing, this partial execution may allow each test to start only the components it needs, so an API integration can be isolated without starting up the frontend, or broken down further to only the individual microservices that matter. Coupled with request redirection and mocking of components, you could create test scenarios that simulate real-world failures between integrations and validate chaos behavior. Imagine chaos testing your application before you even deploy it from your machine, with the same ease as unit testing. The code coverage support is the extra bonus reward: get code coverage metrics for your integration and chaos tests in a way that is often limited to unit tests? Yes, please! The roadmap suggests the current Aspire testing story is only in its infancy, and these improvements will make it a reason to adopt Aspire for testing alone if they materialize as envisioned.</p>
<h3 id="deployment"><a aria-hidden="true" tabindex="-1" href="#deployment"><span class="icon icon-link"></span></a>Deployment</h3>
<p>Deployment bridges development and customer value delivery. Aspire's local orchestration model naturally extends to cloud deployment scenarios. Aspire has been expanding to include deployment targets and publishers, simplifying the process of getting your application into production.
Currently, Aspire publishes artifacts like Bicep, Docker Compose, and Kubernetes manifests. You can deploy any Aspire resource the same way you would without Aspire, but with it you get seamless delivery to deployment targets like Azure Container Apps. While deployment targets are limited and opinionated, the roadmap addresses key enterprise needs that are still missing:</p>
<ul>
<li><strong>Additional deployment targets</strong>: Support for Azure App Service, Azure Functions, and improved Docker/Kubernetes workflows</li>
<li><strong>Environment support</strong>: Define dev/stage/prod environments with specific configurations and secrets</li>
<li><strong>CI/CD pipeline generation</strong>: Auto-generate GitHub Actions, Azure DevOps, and GitLab pipelines</li>
</ul>
<p>Deployment is an emerging focus in the Aspire story. Azure Container Apps is the first deployment target of focus and is flexible as a hosting platform, but it's not flexible enough for all enterprise scenarios, even within corporate environments invested in Azure. The roadmap, as expected, promises more common Azure deployment targets for traditional .NET workloads like Azure App Service and Azure Functions, but it is still lacking in amazing <a href="/blog/posts/reviewing-aspirejs#deployment-targets">polyglot deployment targets like Azure Static Web Apps</a>. Environment support is critical for enterprise adoption, as the majority of enterprises host multiple environments. DevOps practices may push us for consistency between environments, but there are always differences in configuration and secrets to isolate environments. The CI/CD pipeline generation, in addition to environment support, delivers on the idea of code-first DevOps: define your environments and application model in code, then generate the necessary pipelines to deploy it based on your code-first model. The overall deployment story is still evolving, but the question that will persist is whether Aspire can provide enough flexibility to meet the diverse needs of enterprises' existing applications. These features are a step in that direction. I hope the Aspire team delivers, and we see Aspire become a code-first framework for continuous delivery and deployment.</p>
<h2 id="polyglot-aspirations"><a aria-hidden="true" tabindex="-1" href="#polyglot-aspirations"><span class="icon icon-link"></span></a>Polyglot aspirations</h2>
<p>Aspire is not just a .NET framework; it is a polyglot orchestration framework that allows you to model and run conjoined applications in various languages. .NET, JavaScript, Python, and more are all supported, but the only first-class experience is in .NET projects. With the app host authored in C#, the service defaults project providing .NET best practices, and NuGet client integrations for simplifying configuration in your application code, Aspire is an amazing .NET developer experience. You can <a href="/blog/posts/reviewing-aspirejs">host JavaScript</a> and Python applications, but you don't get the same level of integration and tooling. The roadmap reveals the Aspire team's ambition to provide a first-class polyglot experience that extends beyond .NET:</p>
<ul>
<li><strong>Uniform client integrations</strong>: Connection strings, configuration, and telemetry work consistently with new language support via npm (JavaScript) and pip (Python) packages</li>
<li><strong>Templates and samples</strong>: Quickstarts and documentation for C#, JavaScript, and Python</li>
<li><strong>Cross-language app host</strong>: Experimental WebAssembly support for multiple runtimes in a single process</li>
</ul>
<p>The polyglot aspirations of Aspire are focusing on JavaScript and Python support first. The uniform client integrations with npm packages for JavaScript and other languages will get us closer to parity with the .NET experience. Improved documentation and more polyglot samples will also help, as figuring out how to use Aspire currently relies on developers translating between C# and other languages themselves. It is technically a hosting integration, but if Aspire supports the <code>Aspire.Hosting.Testing</code> package in JavaScript, I would be ecstatic. Documentation and packages together could elevate the polyglot experience to make Aspire stand out beyond traditional .NET developers. It may invite more developers to experiment with the .NET platform beyond Aspire as well.</p>
<p>The cross-language app host is a fascinating item and the one I find hardest to envision myself. Will this be a way to write the app host without .NET? Will it wrap all the runtimes in a single process on your computer? What will it actually look like? The roadmap tells us it is experimental, so it may never materialize, or it may be something we start to see soon. I will be watching this closely as it starts to take shape and the value becomes clearer.</p>
<h2 id="artificial-intelligence"><a aria-hidden="true" tabindex="-1" href="#artificial-intelligence"><span class="icon icon-link"></span></a>Artificial intelligence</h2>
<p>While AI dominates software conversations, Aspire has focused on fundamental developer experience improvements rather than AI-first features. As AI applications continue to go mainstream, Aspire is positioned to apply its orchestration strengths to AI workloads. The roadmap outlines several AI-specific features:</p>
<ul>
<li><strong>Token usage visualization</strong>: Real-time token counts, latency, and evaluation metadata in the dashboard</li>
<li><strong>LLM-specific metrics</strong>: Native support for generative AI telemetry, including model name, temperature, and function call traces</li>
<li><strong>Azure AI Foundry</strong>: Integration for building agent-based applications</li>
<li><strong>Aspire MCP server</strong>: Optional runtime endpoint exposing the Aspire model as an MCP server for AI agents</li>
</ul>
<p>Building AI applications is itself a nascent discipline. The Aspire team appears to be taking a measured approach to AI integration instead of branding itself as another set of AI-native tools. These Aspire AI features are focused on two key areas: observability and agents. Observability is another area Aspire already excels at with the Aspire dashboard. Token usage and LLM-specific metrics visualizations in the Aspire dashboard will be a wonderful addition to the existing telemetry and observability features. It stays true to the natural value of Aspire while extending to the needs of local AI development.</p>
<p>On the agentic side, Aspire works today but with notable limitations. Existing AI integrations, like Azure OpenAI and Ollama, provide some options for local and cloud-hosted LLMs. The integration with Azure AI Foundry may extend the catalog and options for LLMs. It will be exceptionally interesting if the integration supports Azure AI Foundry Local capabilities to provide a unified catalog of models both locally and in the cloud. The Aspire MCP server likewise adds agentic capabilities to Aspire. Model Context Protocol (MCP) is becoming an industry standard for how AI agents communicate with, understand, and interact with outside systems. An Aspire MCP server could provide development tools like GitHub Copilot with deep context on your application model and all the resources Aspire manages. I am all for more intelligent development workflows. Like so many other technologies, Aspire is targeting AI trends and trying to provide its own value in the space.</p>
<h2 id="aspire-tooling"><a aria-hidden="true" tabindex="-1" href="#aspire-tooling"><span class="icon icon-link"></span></a>Aspire tooling</h2>
<p>As Aspire evolves into a mature framework, its tooling ecosystem continues expanding beyond the core .NET SDK. The roadmap includes several improvements:</p>
<ul>
<li><strong>Aspire CLI</strong>: Continued improvements and unified commands</li>
<li><strong>WinGet and Homebrew installers</strong>: Standard install support for Windows and macOS</li>
<li><strong>VS Code extension</strong>: Run, debug, and orchestrate polyglot Aspire applications in VS Code</li>
</ul>
<p>The tooling of Aspire is a meta story, and so are its roadmap items. The code-first DevOps value and the polyglot aspirations all deliver on a core premise of Aspire: a simplified developer experience. When the tooling to set up Aspire or interact with it isn't easy, that core premise is lost. The Aspire CLI has already started this meta story with my favorite command, <code>aspire run</code>, which provides a consistent way to run your Aspire hosted applications locally. Continued improvements to the CLI and other commands will make Aspire easier to adopt and use. The WinGet and Homebrew installers are similar in value and may simplify installing the Aspire CLI, which is already more complex than it should be. Finally, the VS Code extension may help deliver on the polyglot aspirations of Aspire by making development with Aspire more accessible within the tools JavaScript and Python developers already use, without relying on CLI knowledge. Sure, CLI commands mean you can do it today, but installing the Aspire CLI and generating projects requires a <a href="/blog/posts/adding-aspire-cli-guide">guide of the right CLI commands</a>. Overall, the meta story of these tools is to simplify using Aspire so that Aspire can simplify your developer experience.</p>
<h2 id="final-thoughts"><a aria-hidden="true" tabindex="-1" href="#final-thoughts"><span class="icon icon-link"></span></a>Final thoughts</h2>
<p>The <a href="https://github.com/dotnet/aspire/discussions/10644">2025 roadmap</a> that the Aspire team published is an exciting glimpse into a rapidly evolving framework. Nothing is a commitment, but the vision tells a story of what Aspire is developing into: a code-first DevOps framework that simplifies local development, testing, and deployment while embracing polyglot development and AI orchestration. I am incredibly excited by this roadmap as it aligns with my own dreams for Aspire. I love what it is today and recommend it to every .NET developer and some polyglot developers. If the Aspire team can deliver on half of these features, it will only continue to be a game-changer for developing distributed applications.</p>
<p>Let me know what you think of Aspire and where it is going. Are you excited about the roadmap? Do you think Aspire can deliver on these promises? I would love to hear your thoughts and experiences with Aspire so far.</p>]]></content:encoded>
            <category>aspire</category>
            <category>cloudnative</category>
            <category>dotnet</category>
            <category>javascript</category>
            <category>azure</category>
            <category>devops</category>
            <category>ai</category>
            <enclosure url="https://victorfrye.com/assets/_blog/aspire-roadmap-2025/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Reviewing Aspire.JS]]></title>
            <link>https://victorfrye.com/blog/posts/reviewing-aspirejs</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/reviewing-aspirejs</guid>
            <pubDate>Wed, 30 Jul 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[A review of Aspire for JavaScript, an orchestration framework for polyglot applications, including current state, personal experiences, and future aspirations.]]></description>
            <content:encoded><![CDATA[<p>Aspire is the coolest thing in software development right now. That's a statement I frequently make, but it comes from a place of genuine excitement for this nascent framework that is transforming how we can model, run, and deploy applications. Local development with Aspire is effortless regardless of the complexity of your architecture. Aspire is a part of the .NET platform, but it extends past .NET to provide polyglot orchestration for JavaScript, Python, and other languages.</p>
<p>One reason I refer to Aspire as the coolest thing in software development is my frequent use of it in JavaScript projects. Whether it's a simple static site or a full-stack application, Aspire has become my go-to tool for local development.</p>
<p>This post is a review of Aspire for JavaScript including current state, my personal experiences, and future aspirations. If you are a JavaScript, .NET, or polyglot developer interested in Aspire, this analysis is for you. If you are not familiar with Aspire, you may want to read a <a href="/blog/posts/hello-aspire-breaking-down-key-features">breakdown of its key features</a> first.</p>
<h2 id="current-state"><a aria-hidden="true" tabindex="-1" href="#current-state"><span class="icon icon-link"></span></a>Current state</h2>
<p>Aspire in its current state is a code-first orchestration framework written in C# but enabling local development and hosting of polyglot applications. JavaScript and .NET exist in harmony with Aspire. Given the common stack of a JavaScript web frontend, a .NET web API backend, and a containerized database or other backing services, Aspire hosts the entire stack and abstracts away the different mechanisms for running and connecting each component.</p>
<p>The above stack is the apparent assumption of Aspire for JavaScript. Aspire allows for modeling and running JavaScript backends, full-stack JavaScript applications, and scripts, but given the .NET-first nature of Aspire, you will be writing some C# code if you use it. Accepting C# for the app host, the orchestrator and application model, Aspire provides two extension points of note for JavaScript development: integration packages and deployment targets.</p>
<h3 id="integrations-packages"><a aria-hidden="true" tabindex="-1" href="#integrations-packages"><span class="icon icon-link"></span></a>Integration packages</h3>
<p>Integration packages are the libraries that extend Aspire to support the various projects, executables, containers, and services that make up your application. These packages are distributed as NuGet packages and can be either hosting or client integrations. Hosting integrations extend the Aspire app host to model and run components like a JavaScript web app. Client integrations are libraries that allow you to consume the Aspire configurations and defaults to connect to the hosted components. However, client integrations are exclusive to .NET projects given they are packaged as NuGet packages.</p>
<p>This still leaves a variety of hosting integrations for JavaScript development to benefit from. The following are some of the most relevant:</p>
<ul>
<li><strong><a href="https://www.nuget.org/packages/Aspire.Hosting.NodeJs">Aspire.Hosting.NodeJs</a></strong>: Provides hosting for Node.js applications via node or npm scripts</li>
<li><strong><a href="https://www.nuget.org/packages/CommunityToolkit.Aspire.Hosting.Bun">CommunityToolkit.Aspire.Hosting.Bun</a></strong>: Provides hosting for Bun applications</li>
<li><strong><a href="https://www.nuget.org/packages/CommunityToolkit.Aspire.Hosting.Deno">CommunityToolkit.Aspire.Hosting.Deno</a></strong>: Provides hosting for Deno applications</li>
<li><strong><a href="https://www.nuget.org/packages/Aspire.Hosting.PostgreSQL">Aspire.Hosting.PostgreSQL</a></strong>: Provides hosting for PostgreSQL database via <a href="https://hub.docker.com/_/postgres">Docker Hub registry postgres images</a></li>
<li><strong><a href="https://www.nuget.org/packages/Aspire.Hosting.MongoDB">Aspire.Hosting.MongoDB</a></strong>: Provides hosting for MongoDB database via <a href="https://hub.docker.com/_/mongo">Docker Hub registry mongo images</a></li>
<li><strong><a href="https://www.nuget.org/packages/Aspire.Hosting.Redis">Aspire.Hosting.Redis</a></strong>: Provides hosting for Redis via <a href="https://hub.docker.com/_/redis">Docker Hub registry redis images</a></li>
<li><strong><a href="https://www.nuget.org/packages/Aspire.Hosting.Azure.Storage">Aspire.Hosting.Azure.Storage</a></strong>: Provides hosting for Microsoft Azure cloud storage services, including Blob, Queue, Table, and Azurite emulation</li>
<li><strong><a href="https://www.nuget.org/packages/Aspire.Hosting.Testing">Aspire.Hosting.Testing</a></strong>: Provides a test host for .NET unit testing frameworks</li>
</ul>
<p>Overall, these packages provide the flexibility of hosting JavaScript apps on the Node.js, Bun, or Deno runtimes. Database and cloud service hosting covers popular JavaScript data solutions like PostgreSQL, MongoDB, Redis, and Azure Blob Storage. Together, Aspire provides the same local development experience for JavaScript applications as it does for .NET, with instantaneous runs and abstractions over other configuration files. The major deficits are that you must write at least some C# code for the Aspire app host, and that consuming the Aspire host for testing requires integration tests written with .NET testing frameworks like xUnit. For true polyglot developers familiar with both JavaScript and .NET, this is a non-issue that provides all the benefits of Aspire with the flexibility of using JavaScript for what JavaScript is best at. However, for a JavaScript-only developer, there are extra barriers to entry here that Aspire has yet to solve.</p>
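<p>As a concrete sketch, a minimal app host using a few of these hosting integrations might look like the following. The project, directory, and resource names here are hypothetical, and the exact extension methods available depend on which integration packages you install:</p>
<pre><code class="language-csharp">var builder = DistributedApplication.CreateBuilder(args);

// Containerized database from Aspire.Hosting.PostgreSQL
var db = builder.AddPostgres("postgres").AddDatabase("appdb");

// .NET web API backend referencing the database
var api = builder.AddProject&lt;Projects.MyApp_Api&gt;("api")
                 .WithReference(db);

// JavaScript frontend from Aspire.Hosting.NodeJs, run via its npm "dev" script
builder.AddNpmApp("frontend", "../frontend", "dev")
       .WithReference(api)
       .WithHttpEndpoint(env: "PORT")
       .WithExternalHttpEndpoints();

builder.Build().Run();
</code></pre>
<p>Running <code>dotnet run</code> against this app host starts the database container, the API, and the npm script together, with connection details injected for you.</p>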
<h3 id="deployment-targets"><a aria-hidden="true" tabindex="-1" href="#deployment-targets"><span class="icon icon-link"></span></a>Deployment targets</h3>
<p>Deployment is an emerging focus in the Aspire story. Applications orchestrated with Aspire can be deployed anywhere, the same way you would deploy without Aspire; however, the focus here is deployment through Aspire. Aspire is expanding to include publishers and deployment targets, taking your modeled application and using it to generate artifacts like Bicep and container images. Given Aspire's origins in .NET and Microsoft solutions, the initial deployment targets are opinionated and limited. By default, the easiest deployment target is <a href="https://learn.microsoft.com/en-us/azure/container-apps/overview">Azure Container Apps</a>, a serverless platform for running containerized applications. However, there is a fundamental flaw here for JavaScript developers: hosting with Azure Container Apps assumes you need a server.</p>
<p>JavaScript developers are accustomed to a true serverless experience, one in which the web browser is the host environment. Frameworks like Next.js allow for server-side computation, but many JavaScript frameworks and applications are designed to run entirely in the browser using a bundle of JavaScript, HTML, and CSS. This has a lot of benefits for developers, including:</p>
<ul>
<li><strong>No server management</strong>: No need to manage servers or containers, just a static file host</li>
<li><strong>Instant scaling</strong>: Static files can be served from a CDN, scaling automatically with demand</li>
<li><strong>Lower costs</strong>: Static file hosting is often cheaper than running containers or VMs</li>
</ul>
<p>And so much more. This is antithetical to traditional .NET development and represents a fundamental difference between JavaScript and .NET. Some JavaScript developers may opt for containerized hosting due to enterprise infrastructure or for self-managed static web servers like Nginx, but Azure already provides a first-class static web hosting solution with <a href="https://learn.microsoft.com/en-us/azure/static-web-apps/overview">Azure Static Web Apps</a>. Azure Static Web Apps is nowhere to be found in the Aspire deployment story, which is a major gap for Aspire for JavaScript.</p>
<h2 id="my-aspirejs-story"><a aria-hidden="true" tabindex="-1" href="#my-aspirejs-story"><span class="icon icon-link"></span></a>My Aspire.JS story</h2>
<p>To understand how Aspire fits JavaScript development currently and potentially in the future, it is helpful to understand how a developer who has adopted Aspire already uses it. I am a full-stack developer who currently favors .NET for backend development, React for frontend development, and Azure for cloud hosting. I started using Aspire for a sample .NET web API that I wanted to run on macOS and Windows, so that anyone could pull down the code and run it with minimal configuration. Aspire was perfect for this, so I started using it for all my .NET projects. This in turn led me to use Aspire to host a React frontend alongside my web API and database, which also proved to be effortless. Finally, I asked the question: <em>Why not use Aspire for my JavaScript only projects?</em></p>
<p>I maintain three static sites, including my personal blog, and I wanted to use Aspire for local development to provide a consistent experience across all my projects. It works. Every personal project, including live websites and demo applications, is locally orchestrated with Aspire by default. I also recommend it for any enterprise .NET project I work on. However, my recommendation currently stops at projects that do not include .NET components. Aspire is an excellent choice for .NET and polyglot projects that include .NET, but the benefits of Aspire for JavaScript-only projects, or polyglot projects without .NET, are not an easy sell. The C# app host, the NuGet-only client integrations, and the lack of aligned polyglot deployment targets are all barriers to entry for JavaScript developers.</p>
<p>The C# app host is a non-issue for me as a .NET developer, but for any project not already using .NET, it means extra SDKs to install and a new language to learn. Admittedly, the app host is not complex C# code until you start creating your own custom components; it is the download of the .NET SDK that is the high barrier. The NuGet client integrations are less a barrier and more a missing feature in the value story. Finally, deployment targets are a nascent feature in a nascent framework. I started using Aspire without its deployment features due to feature immaturity. To this day, I favor Azure Container Apps or Azure Functions for .NET workloads and Azure Static Web Apps for JavaScript workloads, and I handle deployment separately from Aspire. Together, this means the Aspire story for non-.NET applications adds .NET as a development dependency and is missing the client integrations and deployment flexibility I would expect for a recommendation.</p>
<h2 id="future-aspirations"><a aria-hidden="true" tabindex="-1" href="#future-aspirations"><span class="icon icon-link"></span></a>Future aspirations</h2>
<p>Aspire recently published their <a href="/blog/posts/aspire-roadmap-2025">2025 roadmap</a>, which includes several features that may solve the current limitations of Aspire for JavaScript. The most exciting are:</p>
<ul>
<li><strong>Polyglot client integrations</strong>: Connection strings, configuration, and telemetry work consistently via npm packages as they do with existing NuGet packages for .NET projects</li>
<li><strong>Templates and samples</strong>: More documentation and quickstart examples for JavaScript</li>
<li><strong>Cross-language app host</strong>: An experimental WebAssembly app host that may reduce .NET friction for JavaScript developers authoring the Aspire app host</li>
</ul>
<p>These features may make Aspire a more accessible choice for JavaScript developers and extend some of the .NET-exclusive benefits to the JavaScript components in your applications. The npm client integration packages excite me the most as a polyglot developer because they would allow me to integrate databases and cloud services like Azure Storage into my JavaScript components with the same reduced configuration Aspire provides for .NET projects. This adds parity in developer experience and closes the gap for recommending and adopting Aspire for JavaScript development. Documentation improvements are also always welcome and ease adoption. The cross-language app host is interesting, but I am still unsure what it may amount to or whether it will even materialize. If it does, maybe it removes the .NET SDK download as a barrier. These features are directional and not commitments, but they provide hope for increased parity with the Aspire .NET developer experience.</p>
<p>The remaining gap is deployment. This is an emerging area and the .NET story itself is still evolving for deploying with Aspire. However, I am actively watching for how this matures and the targets that get first-class support. If static hosting targets like Azure Static Web Apps are added, Aspire for JavaScript becomes a much more compelling recommendation. If Aspire only provides first-class support for traditionally .NET hosting targets like Azure App Service, Azure Functions, and Azure Container Apps, then the polyglot story remains incomplete.</p>
<h2 id="final-assessment"><a aria-hidden="true" tabindex="-1" href="#final-assessment"><span class="icon icon-link"></span></a>Final assessment</h2>
<p>Aspire is the coolest thing in software development right now and is actively evolving. For polyglot developers familiar with .NET, Aspire is a game-changer and you should experiment with adding it yourself. However, for JavaScript development and polyglot applications without .NET, there are still barriers to entry that prevent Aspire from being a compelling recommendation. Can you do it? Absolutely. Do I use Aspire for JavaScript development? Yes. Do I recommend it for JavaScript only projects? Not yet. But maybe in the future. Maybe soon.</p>]]></content:encoded>
            <category>aspire</category>
            <category>cloudnative</category>
            <category>dotnet</category>
            <category>javascript</category>
            <category>azure</category>
            <enclosure url="https://victorfrye.com/assets/_blog/reviewing-aspirejs/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[dotnet run file.cs]]></title>
            <link>https://victorfrye.com/blog/posts/dotnet-run-file</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/dotnet-run-file</guid>
            <pubDate>Mon, 23 Jun 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[Let's explore the dotnet run file.cs paradigm for writing file-based .NET applications and the new value it brings to the .NET ecosystem.]]></description>
            <content:encoded><![CDATA[<p>I missed something at <a href="https://victorfrye.com/blog/posts/microsoft-build-2025-wrapped">Microsoft Build 2025</a>: the announcement of the new <code>dotnet run file.cs</code> model in <a href="https://devblogs.microsoft.com/dotnet/dotnet-10-preview-4/">.NET 10 Preview 4</a>. This is a new paradigm for running and writing .NET applications and if you are reading this, you might not be the target of this feature. However, you will probably meet or read C# code that is written this way.</p>
<p>This article will explore the new feature of <code>dotnet run file.cs</code> and the value it brings to the .NET ecosystem. Run it!</p>
<h2 id="the-current-project-based-model"><a aria-hidden="true" tabindex="-1" href="#the-current-project-based-model"><span class="icon icon-link"></span></a>The Current Project-Based Model</h2>
<p>Today, if I wanted to write a simple C# console application that outputs "Hello, World!", I would need to do the following:</p>
<ol>
<li>Install the .NET SDK.</li>
<li>Install an IDE or text editor like Visual Studio or Visual Studio Code.</li>
<li>Create a new .NET project using the IDE or the <code>dotnet new</code> CLI command.</li>
<li>Write my code in the <code>Program.cs</code> file.</li>
</ol>
<p>None of these steps is changing. However, here is today's dry run output of the command <code>dotnet new console --name HelloWorld</code>:</p>
<pre><code class="language-text">File actions would have been taken:
  Create: ./HelloWorld.csproj
  Create: ./Program.cs

Processing post-creation actions...
Action would have been taken automatically:
   Restore NuGet packages required by this project
</code></pre>
<p>The above is the dry run output of the <code>dotnet new</code> command. Notice two files are created: <code>HelloWorld.csproj</code> and <code>Program.cs</code>. The <code>csproj</code> file is an XML file that contains information any .NET developer is all too familiar with. The <code>Program.cs</code> file is where I write my code. Additionally, you will quickly see <code>obj</code> and <code>bin</code> directories created and start populating as you write and publish your application. Do you know what both directories are for, even today? Do you find XML friendly to read? Microsoft asked a new question: Is this all overwhelming for someone new?</p>
<p>The keyword above was <strong>new</strong>. I invite you to recall your days learning to code and suppress your experienced instincts. When I do, I remember sitting in a classroom feeling like I might never understand programming and would fail. C# was my first language. We have bootcamps, universities, and online courses in excess to help new developers. That is working, but they are learning JavaScript or Python. Why? Because the onboarding experience is easier. The barrier to entry lower.</p>
<p>What if this changed? Introducing the new <code>dotnet run file.cs</code> paradigm.</p>
<h2 id="the-new-file-based-model"><a aria-hidden="true" tabindex="-1" href="#the-new-file-based-model"><span class="icon icon-link"></span></a>The New File-Based Model</h2>
<p>The <code>dotnet run</code> we keep discussing is the Dotnet CLI command any .NET command-line user is familiar with. However, <code>file.cs</code> refers to a new single-file application model. That means our steps from earlier change to the following:</p>
<ol>
<li>Install the .NET SDK.</li>
<li>Install an IDE or text editor like Visual Studio Code.</li>
<li>Create a new C# file, e.g. <code>hello.cs</code>.</li>
<li>Write my code in the <code>hello.cs</code> file.</li>
</ol>
<p>The steps are incredibly similar but also simplified. You need the .NET SDK and a tool for writing code still, but you no longer need to understand a complex project generation process and only have one file to manage. Let's review it:</p>
<pre><code class="language-csharp">#!/usr/bin/dotnet run
#:sdk Microsoft.NET.Sdk.Web
#:property AssemblyName VictorFrye.HelloWorld

var app = WebApplication.CreateBuilder(args).Build();

app.UseHttpsRedirection();

app.MapGet("/hello", () => "Hello World!");

await app.RunAsync();
</code></pre>
<p>That is it. I could link to a repository, but if you copy and paste this you get a complete .NET application you can run. There is no <code>csproj</code> file, and <code>obj</code> and <code>bin</code> directories are not created in your working directory. And if you run the command <code>dotnet run hello.cs</code>, you get an active Kestrel web server that responds with "Hello World!". The latter half of the code is top-level statements, a feature not so new. However, the first three lines are special.</p>
<p>The first line is a shebang: a Unix convention that tells the system how to execute the file. In this case, it tells the system to use the <code>dotnet run</code> command to execute the file. With this new paradigm, you must have the .NET SDK installed and Dotnet CLI available still. A shebang is not required, but it does enable running the file without explicitly calling <code>dotnet run</code> on Unix-like systems. This is cool, but mostly just a convenience.</p>
<p>The second and third lines are new directives. You may be using directives in your code today, such as <code>#if DEBUG</code> or <code>#region Feature X</code>. However, the new <code>#:</code> directives are unique to the run file paradigm. The <code>.csproj</code> file normally tells our .NET application critical information like SDKs, MSBuild properties, or NuGet packages to use. The run file paradigm still supports these, but instead you use a <code>#:sdk</code> directive or <code>#:property</code> directive. In this case, I'm using the <code>Microsoft.NET.Sdk.Web</code> SDK to pull in ASP.NET Core features for web APIs and setting the assembly name to <code>VictorFrye.HelloWorld</code> because I like my name. These new directives are only for the run file paradigm, and you will get warnings if you try to use them in a traditional project model.</p>
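<p>NuGet package references work the same way through a <code>#:package</code> directive. As a small sketch (the package and version shown here are illustrative), the following single file pulls in the Humanizer library with no project file at all:</p>
<pre><code class="language-csharp">#:package [email protected]

using Humanizer;

// Humanizer turns values into human-friendly strings
Console.WriteLine(TimeSpan.FromDays(2).Humanize());
</code></pre>
<p>Running <code>dotnet run humanize.cs</code> restores the package behind the scenes, just as a <code>PackageReference</code> in a <code>csproj</code> file would.</p>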
<p>Behind the scenes, everything is still there. The project file still exists but is virtual and interpreted by the Dotnet CLI. The <code>obj</code> and <code>bin</code> directories are created, but in a temporary location that is abstracted away. The application is still built and run like any other .NET application. The difference is in the simplicity of authoring C#. However, when the project reaches maturity or someone is ready to take it to the next level, they can convert the file-based application to a traditional project-based application. All you must do is run the following:</p>
<pre><code class="language-bash">dotnet project convert hello.cs
</code></pre>
<h2 id="the-value-added"><a aria-hidden="true" tabindex="-1" href="#the-value-added"><span class="icon icon-link"></span></a>The Value Added</h2>
<p>I am really excited about the <code>dotnet run file.cs</code> paradigm. The primary users targeted are new developers. This is a win if Microsoft succeeds and more developers embrace modern .NET applications. Some might be concerned about new developers not learning all the details of the full project-based application model, but new developers learning .NET mean a larger .NET community, new libraries, and more innovation in the ecosystem. This is a huge win for the .NET developer community.</p>
<p>However, the value added doesn't stop there. File-based applications are also great for scripts and small utility apps. You don't need a folder structure or a csproj file. You can now write a couple C# scripts to help you maintain your existing codebase or automate tasks. This is a huge win for scripting capabilities and reducing project overhead.</p>
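<p>As an illustrative sketch, a hypothetical utility script like <code>loc.cs</code> could sit at the root of a repository and be run with <code>dotnet run loc.cs</code>, no project files required (implicit usings cover <code>System.IO</code> and <code>System.Linq</code> here):</p>
<pre><code class="language-csharp">#!/usr/bin/dotnet run

// Count the lines of C# source under the current directory
var total = Directory.EnumerateFiles(".", "*.cs", SearchOption.AllDirectories)
                     .Sum(file => File.ReadLines(file).Count());

Console.WriteLine($"{total} lines of C# across the codebase");
</code></pre>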
<p>Another use-case is one you might have to read yourself: .NET samples. Sample applications are used by libraries to showcase how to use specific features or APIs. They are also used by conference speakers and at meetups to illustrate concepts or provide live demos of features. In this article itself, I would normally have to create a full project to demonstrate the feature, and I would link the repository so a reader could copy it exactly and reference it or run it themselves. Now, I can provide the entire sample in a code block that is easy to copy and paste. This is a huge win for documentation and sample authors.</p>
<h2 id="the-limitations-so-far"><a aria-hidden="true" tabindex="-1" href="#the-limitations-so-far"><span class="icon icon-link"></span></a>The Limitations So Far</h2>
<p>Right now, file-based applications are limited to a single file. They are also unsupported in Visual Studio, favoring Visual Studio Code as the more likely editor for the targeted users. Finally, the feature is only in .NET 10 preview versions at the moment. It will not be until November 2025 that we see the first general availability release of file-based applications, and likely some time after that before we see new developers learning in this form or a C# scripting revolution.</p>
<h2 id="concluding-remarks"><a aria-hidden="true" tabindex="-1" href="#concluding-remarks"><span class="icon icon-link"></span></a>Concluding Remarks</h2>
<p>The <code>dotnet run file.cs</code> paradigm is a new way to write and run .NET applications. It may or may not be for you, but the goal is a more inclusive and accessible .NET ecosystem. The best outcome is more developers learning and using .NET. Maybe C# scripts take off and we see C# become the new Python. Maybe documentation and sample applications get less verbose. The future is hard to predict, but I am hopeful for a future where I see file-based C# applications in the wild.</p>]]></content:encoded>
            <category>dotnet</category>
            <enclosure url="https://victorfrye.com/assets/_blog/dotnet-run-file/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Adding .NET Aspire]]></title>
            <link>https://victorfrye.com/blog/posts/adding-aspire-cli-guide</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/adding-aspire-cli-guide</guid>
            <pubDate>Wed, 04 Jun 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[A CLI enthusiast's guide to adding Aspire to your .NET projects using the dotnet CLI. Plus, how to install the aspire CLI for running Aspire applications.]]></description>
            <content:encoded><![CDATA[<p>.NET Aspire is a new framework for building cloud-native distributed applications that simplifies local development and deployment. Aspire offers significant benefits through its application modeling and orchestration capabilities. But how do you actually add Aspire to your projects?</p>
<p>In this guide, I'll walk through command line options for adding Aspire to both new and existing applications. Whether you're starting from scratch or enhancing an established project, the dotnet CLI provides templates to generate projects, new solutions, or common standalone files. This post focuses on the Aspire project templates and another useful tool, the aspire CLI, for those interested in checking out .NET Aspire from the command line.</p>
<p>Let's dive into adding Aspire to your .NET projects through the command line!</p>
<h2 id="adding-aspire-to-your-project"><a aria-hidden="true" tabindex="-1" href="#adding-aspire-to-your-project"><span class="icon icon-link"></span></a>Adding Aspire to your project</h2>
<p>You can do all of this in Visual Studio or with the dotnet CLI, but I will be guiding with CLI commands as they are universal across operating systems and IDEs. Also, the CLI is just more fun, right?</p>
<p>With Aspire, you will have two new projects in your solution: <code>AppHost</code> and <code>ServiceDefaults</code>. These can have any name, but these names are the standard. The <code>AppHost</code> project is your local environment entrypoint and where you will model your application. The <code>ServiceDefaults</code> project sets up default configurations deemed best practice for distributed applications. To add these, you can use the Aspire project templates. Using the dotnet CLI, you can run the following command options:</p>
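<p>To ground what <code>ServiceDefaults</code> is for: your service projects reference it and opt in with a single extension method call in their startup code. The sketch below reflects the method names the standard template generates, though your generated code may differ by version:</p>
<pre><code class="language-csharp">var builder = WebApplication.CreateBuilder(args);

// Opt in to OpenTelemetry, health checks, service discovery, and resilience defaults
builder.AddServiceDefaults();

var app = builder.Build();

// Map the default health check endpoints
app.MapDefaultEndpoints();

app.Run();
</code></pre>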
<h3 id="install-aspire-templates"><a aria-hidden="true" tabindex="-1" href="#install-aspire-templates"><span class="icon icon-link"></span></a>Install Aspire templates</h3>
<p>The first thing you'll have to do is install the Aspire project templates. These provide the necessary scaffolding for the other commands below to work. You can do this by running the following command in your terminal:</p>
<pre><code class="language-bash">dotnet new install Aspire.ProjectTemplates
</code></pre>
<p>You may have to append a <code>@X.X.X</code> version and/or the <code>--force</code> option to install a newer version of the templates.</p>
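<p>For example, pinning the templates to a specific version with a forced reinstall (the version number here is illustrative):</p>
<pre><code class="language-bash">dotnet new install Aspire.ProjectTemplates@9.3.0 --force
</code></pre>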
<h3 id="option-1-add-apphost-and-servicedefaults"><a aria-hidden="true" tabindex="-1" href="#option-1-add-apphost-and-servicedefaults"><span class="icon icon-link"></span></a>Option 1: Add AppHost and ServiceDefaults</h3>
<p>To add both the <code>AppHost</code> and <code>ServiceDefaults</code> projects to your existing solution, you can run the following command:</p>
<pre><code class="language-bash">dotnet new aspire --name "MyAspireApp" --output .
</code></pre>
<p>This creates both new projects. The <code>--name</code> option specifies the root solution name; both projects will be created as <code>MyAspireApp.AppHost</code> and <code>MyAspireApp.ServiceDefaults</code> in this example. The <code>--output</code> option specifies the current directory as the output location, but you can change this to any directory you prefer, such as <code>src</code>.</p>
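<p>Note that, depending on the template version, the new projects may not be wired into a pre-existing solution file automatically. If not, you can add them yourself (the project paths here assume the example above):</p>
<pre><code class="language-bash">dotnet sln add "MyAspireApp.AppHost/MyAspireApp.AppHost.csproj"
dotnet sln add "MyAspireApp.ServiceDefaults/MyAspireApp.ServiceDefaults.csproj"
</code></pre>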
<h3 id="option-2-add-apphost-only"><a aria-hidden="true" tabindex="-1" href="#option-2-add-apphost-only"><span class="icon icon-link"></span></a>Option 2: Add AppHost only</h3>
<p>If you only want to start with the <code>AppHost</code> project, you can run the following template option:</p>
<pre><code class="language-bash">dotnet new aspire-apphost --name "MyAspireApp.AppHost" --output "AppHost"
</code></pre>
<p>This will create only the <code>AppHost</code> project, which provides the local orchestration and entrypoint for Aspire applications. Here, the <code>--name</code> option specifies the fully qualified project name, <code>MyAspireApp.AppHost</code>, and the <code>--output</code> option specifies a new directory called <code>AppHost</code> to contain the project files.</p>
<h3 id="option-3-add-servicedefaults-only"><a aria-hidden="true" tabindex="-1" href="#option-3-add-servicedefaults-only"><span class="icon icon-link"></span></a>Option 3: Add ServiceDefaults only</h3>
<p>If you want to start with just the <code>ServiceDefaults</code> project, you can run the following command:</p>
<pre><code class="language-bash">dotnet new aspire-servicedefaults --name "MyAspireApp.ServiceDefaults" --output "ServiceDefaults"
</code></pre>
<p>This will create only the <code>ServiceDefaults</code> project. This project is a shared .NET library that contains useful defaults for distributed applications, such as HTTP resiliency and health check configuration. The <code>--name</code> option specifies the fully qualified project name, <code>MyAspireApp.ServiceDefaults</code>, and the <code>--output</code> option specifies a new directory called <code>ServiceDefaults</code> to contain these specific project files.</p>
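<p>To actually consume the defaults, each service project needs a reference to the <code>ServiceDefaults</code> library and a call to its extension methods (conventionally <code>builder.AddServiceDefaults()</code>) at startup. As a sketch, assuming a web API project named <code>MyAspireApp.Api</code>:</p>
<pre><code class="language-bash">dotnet add "MyAspireApp.Api/MyAspireApp.Api.csproj" reference "MyAspireApp.ServiceDefaults/MyAspireApp.ServiceDefaults.csproj"
</code></pre>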
<h3 id="option-4-start-a-new-solution-with-aspire"><a aria-hidden="true" tabindex="-1" href="#option-4-start-a-new-solution-with-aspire"><span class="icon icon-link"></span></a>Option 4: Start a new solution with Aspire</h3>
<p>One final option is the Aspire starter template. This will create a new solution with both the <code>AppHost</code> and <code>ServiceDefaults</code> projects, plus a Blazor web frontend and web API backend service. You can run the following command to create a new solution with all of these projects:</p>
<pre><code class="language-bash">dotnet new aspire-starter --name "MyAspireApp" --output .
</code></pre>
<p>The resulting solution will contain at least four projects:</p>
<ul>
<li><code>MyAspireApp.AppHost</code>: The main entry point for the application.</li>
<li><code>MyAspireApp.ServiceDefaults</code>: A library of default services and configurations.</li>
<li><code>MyAspireApp.Web</code>: A Blazor web frontend.</li>
<li><code>MyAspireApp.ApiService</code>: An ASP.NET web API backend service.</li>
</ul>
<h2 id="installing-the-aspire-cli"><a aria-hidden="true" tabindex="-1" href="#installing-the-aspire-cli"><span class="icon icon-link"></span></a>Installing the Aspire CLI</h2>
<p>Okay, now that we have the Aspire projects set up, there's one more thing we might want: the aspire CLI. The aspire CLI is a preview tool that simplifies application startup. To install it, you can run the following command:</p>
<pre><code class="language-bash">dotnet tool install --global aspire.cli --prerelease
</code></pre>
<p>After installing the CLI, you can run the <code>aspire --help</code> command to see usage and available options. Effectively, this CLI is a wrapper around the <code>dotnet run</code> command that removes the need to specify the <code>--project</code> option. It will automatically find the <code>AppHost</code> project in your solution and run it.</p>
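<p>Once installed, starting your application is as simple as running the CLI from your repository:</p>
<pre><code class="language-bash"># List available commands and options
aspire --help

# Find and run the AppHost project in the current repository
aspire run
</code></pre>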
<h2 id="next-steps"><a aria-hidden="true" tabindex="-1" href="#next-steps"><span class="icon icon-link"></span></a>Next Steps</h2>
<p>At this point, you should have the Aspire projects you want in your solution and the aspire CLI installed. You can now start modeling your application in the <code>AppHost</code> project and customize the <code>ServiceDefaults</code> project to fit your needs.</p>]]></content:encoded>
            <category>aspire</category>
            <category>cloudnative</category>
            <category>dotnet</category>
            <enclosure url="https://victorfrye.com/assets/_blog/adding-aspire-cli-guide/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[.NET Options Pattern]]></title>
            <link>https://victorfrye.com/blog/posts/dotnet-options-pattern</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/dotnet-options-pattern</guid>
            <pubDate>Wed, 04 Jun 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[Let's explore the .NET Options pattern and how it can be used to bind application settings and user secrets in your application.]]></description>
            <content:encoded><![CDATA[<p>Configuration management is a crucial aspect of building applications. Whether locally or in the cloud, you need a way to manage settings and secrets without hardcoding values everywhere. In .NET, the options pattern is a powerful out-of-the-box approach to binding configuration values from just about any source, including JSON files, environment variables, user secrets, and even cloud configuration providers.</p>
<p>In this article, we'll explore the .NET options pattern, how to set it up in your application, and some key features for flexibility in your application.</p>
<h2 id="what-is-the-options-pattern"><a aria-hidden="true" tabindex="-1" href="#what-is-the-options-pattern"><span class="icon icon-link"></span></a>What is the Options Pattern?</h2>
<p>The options pattern is a .NET design pattern that provides a way to bind configuration values to strongly typed classes. This allows us to take advantage of object-oriented programming principles, such as type safety and separation of concerns, while also extracting values from various configuration stores. In .NET, the options pattern is implemented through the <code>Microsoft.Extensions.Options</code> and <code>Microsoft.Extensions.Configuration</code> libraries, already included in frameworks like ASP.NET Core, and the base <code>IOptions&#x3C;TOptions></code> interface.</p>
<p>I consider the options pattern table stakes for any .NET application as it cleanly separates application code from configuration. Local development can use JSON files and user secrets, while production can use remote configuration stores like Azure App Configuration and Key Vault, all without changing your code. You can even mix and match local and remote sources, for example developing locally against integrated dev/test services. This flexibility significantly improves developer productivity and application maintainability.</p>
<h2 id="setting-up-the-options-pattern"><a aria-hidden="true" tabindex="-1" href="#setting-up-the-options-pattern"><span class="icon icon-link"></span></a>Setting Up the Options Pattern</h2>
<p>To get started with the options pattern, you need 3 things:</p>
<ol>
<li>A settings class that represents the configuration values you want to bind.</li>
<li>A configuration source, such as your <code>appsettings.json</code> file or user secrets.</li>
<li>A binding and registration of the settings class.</li>
</ol>
<p>In this example, we'll be binding a simple settings class for configuration values related to calling a downstream API. You can find the complete code for this example in the <a href="https://github.com/victorfrye/hellooptions">GitHub repository</a>.</p>
<h3 id="the-settings-class"><a aria-hidden="true" tabindex="-1" href="#the-settings-class"><span class="icon icon-link"></span></a>The Settings Class</h3>
<p>When defining your settings classes, you will want to split apart your settings based on their consumption. This allows us to separate the concerns of different parts of our application configuration and later will make it easier to consume using dependency injection. In this case, we have only one settings class named <code>PlaceholderApiSettings</code> that contains a base URL, an API key, and an optional version number:</p>
<pre><code class="language-csharp">using System.ComponentModel.DataAnnotations;

public sealed class PlaceholderApiSettings
{
    internal const string SectionName = nameof(PlaceholderApiSettings);

    [Required]
    public required Uri BaseUrl { get; set; }

    [Required]
    public required string ApiKey { get; set; }

    public string Version { get; set; } = "v1";
}
</code></pre>
<p>You will notice that we also have a constant for the section name. Since we might have multiple settings classes, it is a good idea to define sections in our configuration to avoid overlapping settings. Additionally, we are using the <code>Required</code> attribute and <code>required</code> keyword to denote we expect these properties to always be set, whether via constructor or binding. These are all optional elements.</p>
<p>The requirements for a settings class are simple: its property names just need to match the keys we expect our configuration sources to use.</p>
<h3 id="the-configuration-sources"><a aria-hidden="true" tabindex="-1" href="#the-configuration-sources"><span class="icon icon-link"></span></a>The Configuration Sources</h3>
<p>The options pattern binds configuration values from any source supported by <code>Microsoft.Extensions.Configuration</code>. That includes:</p>
<ul>
<li>Your <code>appsettings.json</code> file</li>
<li>Environment variables</li>
<li>.NET Secrets Manager (aka user secrets)</li>
<li>Azure App Configuration</li>
<li>Azure Key Vault</li>
<li>And more!</li>
</ul>
<p>For our example, we will use the <code>appsettings.json</code> file and user secrets. Recalling our section name in the settings class, we can define a section and two field values in our <code>appsettings.json</code> file like this:</p>
<pre><code class="language-json">{
  // Other configuration settings...
  "PlaceholderApiSettings": {
    "BaseUrl": "https://jsonplaceholder.typicode.com/todos",
    "Version": "v1"
  }
}
</code></pre>
<p>We can nest our settings under our section name or nest that under further prefixes for any settings we may want to group together. We are creating a fully qualified path to each setting field for any configuration source we bind equivalent to what we are expecting, <code>PlaceholderApiSettings:BaseUrl</code> for example. This value might differ by environment, but as long as we have one value set per environment by any configuration source available in that environment, we can bind it to our settings class.</p>
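<p>As an example of this source flexibility, the same fully qualified key can come from an environment variable, where the <code>:</code> hierarchy separator becomes a double underscore:</p>
<pre><code class="language-bash"># PlaceholderApiSettings__BaseUrl maps to PlaceholderApiSettings:BaseUrl
export PlaceholderApiSettings__BaseUrl="https://jsonplaceholder.typicode.com/todos"
</code></pre>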
<p>Sometimes we have sensitive information that we don't want to commit in source code. Here, we can use the .NET Secrets Manager to store and still bind sensitive values like API keys. To do this, we can use the <code>dotnet user-secrets</code> command group in the dotnet CLI. If you haven't already initialized user secrets for your project, you can do so with the following command:</p>
<pre><code class="language-bash">dotnet user-secrets init --project "Path/To/Your/Project.csproj"
</code></pre>
<p>This command will add a <code>UserSecretsId</code> property to your project file, which uniquely identifies your project. This ID allows you to store secrets safely outside of your source code.</p>
<p>From here, we can now add our API key to the secrets file with the following command:</p>
<pre><code class="language-bash">dotnet user-secrets set "PlaceholderApiSettings:ApiKey" "SuperSecretApiKey123" --project "Path/To/Your/Project.csproj"
</code></pre>
<p>This command will create or update the <code>secrets.json</code> file with the key-value pair for our API key. The resulting file will look something like this:</p>
<pre><code class="language-json">{
  "PlaceholderApiSettings:ApiKey": "SuperSecretApiKey123"
}
</code></pre>
<p>Now we have two configuration sources with the various values set.</p>
<h3 id="binding-the-options"><a aria-hidden="true" tabindex="-1" href="#binding-the-options"><span class="icon icon-link"></span></a>Binding the Options</h3>
<p>The final step towards setting up the options pattern is binding and registration of our settings class. This is where the <code>Microsoft.Extensions.Options</code> library comes into play. In our <code>Program.cs</code> entry point, we need to add the options, configure the binding source, and, optionally, validate the settings to ensure they meet our expectations. To do this, we can use the <code>AddOptions</code>, <code>BindConfiguration</code>, and <code>ValidateDataAnnotations</code> method chain on the <code>IServiceCollection</code>:</p>
<pre><code class="language-csharp">var builder = WebApplication.CreateBuilder(args);

builder.Services.AddOptions&#x3C;PlaceholderApiSettings>()
                .BindConfiguration(PlaceholderApiSettings.SectionName)
                .ValidateDataAnnotations();

// Registration of other services and configuration...

var app = builder.Build();

// The rest of your application setup...

await app.RunAsync();
</code></pre>
<p>The <code>AddOptions&#x3C;T></code> method gets the options builder for our settings class, allowing us to configure it. The <code>BindConfiguration</code> method takes the section name from our settings class, telling the options builder the configuration section path to bind to (in this case, <code>PlaceholderApiSettings:*</code>). Finally, the optional <code>ValidateDataAnnotations</code> method tells the options builder to use the data annotations we defined to validate the values during binding. You can use other validation methods, such as <code>ValidateOnStart</code> or <code>Validate</code> to perform validation at different times or with custom logic. The key is we have a strongly typed settings class and a section path that aligns with the keys in our various configuration sources.</p>
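<p>If you go the custom-logic route, the same chain accepts a predicate. As a sketch (the HTTPS rule here is an illustrative example, not from the sample repository):</p>
<pre><code class="language-csharp">builder.Services.AddOptions&#x3C;PlaceholderApiSettings>()
                .BindConfiguration(PlaceholderApiSettings.SectionName)
                .ValidateDataAnnotations()
                .Validate(settings => settings.BaseUrl.Scheme == Uri.UriSchemeHttps,
                          "BaseUrl must use HTTPS.")
                .ValidateOnStart(); // Fail fast at startup instead of on first resolution
</code></pre>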
<p>After this setup, we now have multiple new classes registered in our dependency injection container that we can use to access our settings.</p>
<h2 id="consuming-the-options"><a aria-hidden="true" tabindex="-1" href="#consuming-the-options"><span class="icon icon-link"></span></a>Consuming the Options</h2>
<p>We have defined our settings class, configured our sources, and bound our options. Now we need to consume them in our application. The options pattern already has three interfaces implemented that we can use to access our settings:</p>
<ul>
<li><code>IOptions&#x3C;T></code>: A singleton instance bound with the settings values at startup.</li>
<li><code>IOptionsSnapshot&#x3C;T></code>: A scoped instance for accessing settings in transient or per-request services; it can be configured to reload on changes.</li>
<li><code>IOptionsMonitor&#x3C;T></code>: A singleton instance for accessing settings in singleton scenarios that also supports change notifications.</li>
</ul>
<p>Instances of all three interfaces are pre-configured in our dependency injection container by the <code>AddOptions</code> method chain and can be used in any service through constructor injection. The base use case is the <code>IOptions</code> interface, though the latter two support advanced scenarios like refreshing settings at runtime. In our example, we will use <code>IOptions&#x3C;PlaceholderApiSettings></code> to access our settings in the client class for our placeholder API.</p>
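<p>As an illustration of the change-notification scenario, a singleton service could take <code>IOptionsMonitor&#x3C;T></code> and subscribe to updates. A sketch, not from the sample repository:</p>
<pre><code class="language-csharp">using Microsoft.Extensions.Options;

public class PlaceholderSettingsWatcher
{
    private PlaceholderApiSettings _settings;

    public PlaceholderSettingsWatcher(IOptionsMonitor&#x3C;PlaceholderApiSettings> monitor)
    {
        _settings = monitor.CurrentValue;

        // Invoked whenever a reloadable source, such as appsettings.json, changes
        monitor.OnChange(updated => _settings = updated);
    }
}
</code></pre>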
<p>Using dependency injection, we can retrieve our settings now like this:</p>
<pre><code class="language-csharp">using Microsoft.Extensions.Options;

public class PlaceholderClient(IOptions&#x3C;PlaceholderApiSettings> options)
{
  private readonly PlaceholderApiSettings _settings = options.Value;

  // Omitted methods for brevity
}
</code></pre>
<p>Through constructor injection of the options interface of our choice, we can access the bound settings class and all its members through the <code>Value</code> property. Now, we can use the <code>_settings</code> field to access any of the properties we defined, including the base URL from our JSON source and the API key from our user secrets source, in this <code>PlaceholderClient</code>.</p>
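<p>From there, the bound settings can drive real calls. A hypothetical method on this client might look like the following, where the endpoint shape and header name are illustrative and an <code>HttpClient</code> field is assumed:</p>
<pre><code class="language-csharp">public async Task&#x3C;string> GetTodoAsync(int id)
{
    // Build the request from the bound configuration values
    using var request = new HttpRequestMessage(
        HttpMethod.Get, new Uri(_settings.BaseUrl, $"{_settings.Version}/{id}"));
    request.Headers.Add("X-Api-Key", _settings.ApiKey);

    using var response = await _httpClient.SendAsync(request);
    response.EnsureSuccessStatusCode();
    return await response.Content.ReadAsStringAsync();
}
</code></pre>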
<h2 id="conclusion"><a aria-hidden="true" tabindex="-1" href="#conclusion"><span class="icon icon-link"></span></a>Conclusion</h2>
<p>The .NET options pattern is a simple, unified way to manage and consume configuration values in our applications. By defining custom settings classes, we get the benefits of type safety and the separation of concerns. We are able to consume values from multiple configuration sources like our application settings or user secrets locally and remote configuration stores like Azure App Configuration or Key Vault in the cloud. Finally, by binding our settings to configuration paths, we can easily access them through dependency injection and reap the benefits of inversion of control in our application. All together, this design pattern offers a flexible foundation for managing configuration values in .NET applications.</p>
<p>If you want to explore more about the options pattern, you can find additional resources and extensions in the official <a href="https://learn.microsoft.com/en-us/dotnet/core/extensions/options">Microsoft Learn documentation</a>.</p>]]></content:encoded>
            <category>dotnet</category>
            <enclosure url="https://victorfrye.com/assets/_blog/dotnet-options-pattern/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Local Friendly .NET Aspire]]></title>
            <link>https://victorfrye.com/blog/posts/local-friendly-aspire-modeling</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/local-friendly-aspire-modeling</guid>
            <pubDate>Wed, 04 Jun 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[Let's explore the .NET Aspire app host and how modeling your local environment builds a better developer experience.]]></description>
            <content:encoded><![CDATA[<p>.NET Aspire is a new framework for building cloud-native and distributed applications. It brings a <a href="https://victorfrye.com/blog/posts/hello-aspire-breaking-down-key-features">host of key features as I've previously discussed</a>, but my two favorites are the ability to model your application stack in C# code and the instant local development environment you get after doing so.</p>
<p>In this post, I'll explore these features through two sample applications: a classic CRUD application with containerized backing services and an AI prototype that relies on cloud services. Both showcase how Aspire solves the complexity of running distributed applications locally. You can find the code for these examples on GitHub at <a href="https://github.com/victorfrye/crudcounter">victorfrye/crudcounter</a> and <a href="https://github.com/victorfrye/mockingmirror">victorfrye/mockingmirror</a>.</p>
<h2 id="modeling-your-application"><a aria-hidden="true" tabindex="-1" href="#modeling-your-application"><span class="icon icon-link"></span></a>Modeling your application</h2>
<p>To start modeling our application, we need an Aspire app host project. This is a .NET project that uses the Aspire SDK to define the resources and integrations needed for our application. I cover how to add an Aspire app host project in my <a href="https://victorfrye.com/blog/posts/adding-aspire-cli-guide">command line guide</a>.</p>
<p>Here I'll show you two app host projects. Two files make up the core of an Aspire app host: a <code>Program.cs</code> file and a <code>.csproj</code> file. These can be extended as you please, but these two files are the current minimum. The <code>.csproj</code> file provides our SDK and hosting integration package references, while <code>Program.cs</code> contains the C# code for modeling and serves as the entrypoint for our local application going forward.</p>
<h3 id="crud-application"><a aria-hidden="true" tabindex="-1" href="#crud-application"><span class="icon icon-link"></span></a>CRUD application</h3>
<p>First, the <code>csproj</code> file for our CRUD application host looks like this:</p>
<pre><code class="language-xml">&#x3C;Project Sdk="Microsoft.NET.Sdk">

  &#x3C;Sdk Name="Aspire.AppHost.Sdk" Version="9.3.0" />

  &#x3C;PropertyGroup>
    &#x3C;OutputType>Exe&#x3C;/OutputType>
    &#x3C;TargetFramework>net9.0&#x3C;/TargetFramework>
    &#x3C;ImplicitUsings>enable&#x3C;/ImplicitUsings>
    &#x3C;Nullable>enable&#x3C;/Nullable>

    &#x3C;AssemblyName>VictorFrye.CrudCounter.AppHost&#x3C;/AssemblyName>
    &#x3C;RootNamespace>VictorFrye.CrudCounter.AppHost&#x3C;/RootNamespace>

    &#x3C;UserSecretsId>2bad5002-9943-41cd-9a77-ec579ba4a680&#x3C;/UserSecretsId>
  &#x3C;/PropertyGroup>

  &#x3C;ItemGroup>
    &#x3C;PackageReference Include="Aspire.Hosting.AppHost" Version="9.3.0" />
    &#x3C;PackageReference Include="Aspire.Hosting.NodeJs" Version="9.3.0" />
    &#x3C;PackageReference Include="Aspire.Hosting.Redis" Version="9.3.0" />
    &#x3C;PackageReference Include="Aspire.Hosting.SqlServer" Version="9.3.0" />
  &#x3C;/ItemGroup>

  &#x3C;ItemGroup>
    &#x3C;ProjectReference Include="..\WebApi\WebApi.csproj" />
  &#x3C;/ItemGroup>

  &#x3C;Target Name="RestoreNpm" BeforeTargets="Build" Condition=" '$(DesignTimeBuild)' != 'true' ">
    &#x3C;ItemGroup>
      &#x3C;PackageJsons Include="..\WebClient\package.json" />
    &#x3C;/ItemGroup>

    &#x3C;!-- Install npm packages if node_modules is missing -->
    &#x3C;Message Importance="Normal" Text="Installing npm packages for %(PackageJsons.RelativeDir)"
      Condition="!Exists('%(PackageJsons.RootDir)%(PackageJsons.Directory)/node_modules')" />
    &#x3C;Exec Command="npm install" WorkingDirectory="%(PackageJsons.RootDir)%(PackageJsons.Directory)"
      Condition="!Exists('%(PackageJsons.RootDir)%(PackageJsons.Directory)/node_modules')" />
  &#x3C;/Target>

&#x3C;/Project>
</code></pre>
<p>There are four callouts here:</p>
<ol>
<li>The <code>Sdk</code> element specifies we are using the Aspire SDK for this project. This extends our .NET SDK for Aspire app hosting capabilities.</li>
<li>The <code>ProjectReference</code> element includes a reference to my existing ASP.NET Core Web API project. Aspire integrates seamlessly with other .NET projects.</li>
<li>The <code>PackageReference</code> item group includes a mix of dependencies for hosting our various services, e.g. Node.js for the web client, Redis for caching, and SQL Server for our database.</li>
<li>The <code>RestoreNpm</code> target: a nasty bit of build magic I'm using to ensure all npm packages are installed for the web client on first-time repository runs.</li>
</ol>
<p>You could get away with only the <code>Sdk</code> element, but why reinvent the wheel? Referencing existing .NET projects is commonplace in the ecosystem already. You could build your own resource models, but prebuilt packages from Microsoft are less work and at this stage we probably don't even know how. Finally, that <code>RestoreNpm</code> target is a huge convenience for our end-goal: a single step to run our entire application.</p>
<p>Okay, but what about the actual C# code? Let's look at the <code>Program.cs</code> file now:</p>
<pre><code class="language-csharp">var builder = DistributedApplication.CreateBuilder(args);

var sql = builder.AddSqlServer("sql")
                 .AddDatabase("db");

var cache = builder.AddRedis("cache");

var api = builder.AddProject&#x3C;Projects.WebApi>("api")
                 .WithReference(sql)
                 .WaitFor(sql)
                 .WithReference(cache)
                 .WaitFor(cache)
                 .WithHttpHealthCheck("/alive")
                 .WithExternalHttpEndpoints();

builder.AddNpmApp("client", "../WebClient", "dev")
       .WithReference(api)
       .WaitFor(api)
       .WithEnvironment("NEXT_PUBLIC_API_BASEURL", api.GetEndpoint("https"))
       .WithHttpEndpoint(env: "PORT");

await builder.Build().RunAsync();
</code></pre>
<p>This code models our entire application stack in a few lines:</p>
<ul>
<li>Creates backing infrastructure, i.e. SQL Server and Redis cache as containers</li>
<li>References our .NET web API project and configures it to wait for dependencies</li>
<li>Adds our JavaScript frontend and connects it to the API</li>
<li>Orchestrates everything to run together</li>
</ul>
<p>Without Aspire, this would require multiple manual steps: starting containers, launching the API, configuring the frontend with the right API URL, and hoping everything connects correctly.</p>
<p>With Aspire, it's a single step in any of three forms: press F5 in your IDE, run <code>dotnet run --project "Path/To/MyAspireApp.AppHost"</code> in your terminal, or run <code>aspire run</code> with the Aspire CLI.</p>
<p>If you chose to install the Aspire CLI and run that way, you are greeted with a beautiful output like this:</p>
<pre><code class="language-plaintext">Dashboard:
📈  https://localhost:17280/login?t=af9ee87cf516605f0639052a6731c2d0

╭──────────┬───────────────────────────┬─────────┬────────────────────────╮ 
│ Resource │ Type                      │ State   │ Endpoint(s)            │ 
├──────────┼───────────────────────────┼─────────┼────────────────────────┤ 
│ api      │ Project                   │ Running │ https://localhost:7558 │ 
│          │                           │         │ http://localhost:5556  │ 
│ cache    │ Container                 │ Running │ tcp://localhost:60771  │ 
│ client   │ Executable                │ Running │ http://localhost:60772 │ 
│ db       │ SqlServerDatabaseResource │ Running │ None                   │ 
│ sql      │ Container                 │ Running │ tcp://localhost:60773  │ 
╰──────────┴───────────────────────────┴─────────┴────────────────────────╯ 
Press CTRL-C to stop the app host and exit.
</code></pre>
<p>Whether or not you use the Aspire CLI, you will get a dashboard link in your terminal or have one opened in your browser. This is the Aspire dashboard, which provides a web portal to your local environment with a resource table or graph, console logs for each resource, traces for requests between resources, and state information about each resource.</p>
<p><img src="/assets/_blog/local-friendly-aspire-modeling/crud_dashboard.png" alt="The dashboard of our CRUD application sample with resources for SQL database, Redis, .NET web API, and npm web client."></p>
<p>Okay, but what if not everything is local? Well let's look at the AI sample now.</p>
<h3 id="ai-application"><a aria-hidden="true" tabindex="-1" href="#ai-application"><span class="icon icon-link"></span></a>AI application</h3>
<p>First, the <code>csproj</code> file for our AI application host looks like this:</p>
<pre><code class="language-xml">&#x3C;Project Sdk="Microsoft.NET.Sdk">

  &#x3C;Sdk Name="Aspire.AppHost.Sdk" Version="9.3.0" />

  &#x3C;PropertyGroup>
    &#x3C;OutputType>Exe&#x3C;/OutputType>
    &#x3C;TargetFramework>net9.0&#x3C;/TargetFramework>
    &#x3C;ImplicitUsings>enable&#x3C;/ImplicitUsings>
    &#x3C;Nullable>enable&#x3C;/Nullable>

    &#x3C;AssemblyName>VictorFrye.MockingMirror.AppHost&#x3C;/AssemblyName>
    &#x3C;RootNamespace>VictorFrye.MockingMirror.AppHost&#x3C;/RootNamespace>

   &#x3C;UserSecretsId>353d8321-8cea-41fd-b09b-0503c184b4c8&#x3C;/UserSecretsId>  
  &#x3C;/PropertyGroup>

  &#x3C;ItemGroup>
    &#x3C;PackageReference Include="Aspire.Hosting.AppHost" Version="9.3.0" />
    &#x3C;PackageReference Include="Aspire.Hosting.Azure.CognitiveServices" Version="9.3.0" />
    &#x3C;PackageReference Include="Aspire.Hosting.NodeJs" Version="9.3.0" />
  &#x3C;/ItemGroup>

  &#x3C;ItemGroup>
    &#x3C;ProjectReference Include="..\WebApi\WebApi.csproj" />
  &#x3C;/ItemGroup>

  &#x3C;Target Name="RestoreNpm" BeforeTargets="Build" Condition=" '$(DesignTimeBuild)' != 'true' ">
    &#x3C;ItemGroup>
      &#x3C;PackageJsons Include="..\WebClient\package.json" />
    &#x3C;/ItemGroup>

    &#x3C;!-- Install npm packages if node_modules is missing -->
    &#x3C;Message Importance="Normal" Text="Installing npm packages for %(PackageJsons.RelativeDir)"
      Condition="!Exists('%(PackageJsons.RootDir)%(PackageJsons.Directory)/node_modules')" />
    &#x3C;Exec Command="npm install" WorkingDirectory="%(PackageJsons.RootDir)%(PackageJsons.Directory)"
      Condition="!Exists('%(PackageJsons.RootDir)%(PackageJsons.Directory)/node_modules')" />
  &#x3C;/Target>

&#x3C;/Project>
</code></pre>
<p>Very similar to the CRUD app with the SDK, package references, project references, and a custom target. The main difference is the <code>Aspire.Hosting.Azure.CognitiveServices</code> package reference, which provides the Azure AI services we will be using. And the Program.cs? This gets a bit different:</p>
<pre><code class="language-csharp">var builder = DistributedApplication.CreateBuilder(args);

var oaiName = builder.AddParameter("OpenAIName");
var oaiResourceGroup = builder.AddParameter("OpenAIResourceGroup");
var oaiModel = builder.AddParameter("OpenAIModel");
var speechKey = builder.AddParameter("SpeechKey", secret: true);
var speechRegion = builder.AddParameter("SpeechRegion");

var openai = builder.AddAzureOpenAI("openai")
                    .AsExisting(oaiName, oaiResourceGroup);

var api = builder.AddProject&#x3C;Projects.WebApi>("api")
                 .WithReference(openai)
                 .WaitFor(openai)
                 .WithEnvironment("ChatClientSettings__DeploymentName", oaiModel)
                 .WithEnvironment("SpeechClientSettings__ApiKey", speechKey)
                 .WithEnvironment("SpeechClientSettings__Region", speechRegion)
                 .WithHttpHealthCheck("/alive")
                 .WithExternalHttpEndpoints();

builder.AddNpmApp("client", "../WebClient", "dev")
       .WithReference(api)
       .WaitFor(api)
       .WithEnvironment("NEXT_PUBLIC_API_BASEURL", api.GetEndpoint("https"))
       .WithHttpEndpoint(env: "PORT")
       .WithExternalHttpEndpoints();

await builder.Build().RunAsync();
</code></pre>
<p>In this example, we model external cloud services that can't run locally. We use parameters to extract variables, such as API keys and Azure resource information, from local configuration stores. We use the <code>AsExisting</code> method to reference a pre-provisioned Azure OpenAI resource through its existing Aspire hosting integration. For Azure AI Speech, which does not have an Aspire integration, we simply pass the parameter values as environment variables. This approach connects our web API to backing Azure AI services while still giving us the convenience of Aspire local orchestration.</p>
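<p>Parameter values themselves resolve from the app host's configuration under a <code>Parameters</code> section, whether in <code>appsettings.json</code> or user secrets. For example, the non-secret values could live in <code>appsettings.json</code> like this (the values here are placeholders), while the secret <code>SpeechKey</code> belongs in user secrets:</p>
<pre><code class="language-json">{
  "Parameters": {
    "OpenAIName": "my-openai-resource",
    "OpenAIResourceGroup": "my-resource-group",
    "OpenAIModel": "gpt-4o-mini",
    "SpeechRegion": "eastus"
  }
}
</code></pre>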
<p>There's a lot to consume here, but the key takeaway is that you have options. Whether your local environment is fully isolated or has external dependencies, and whether or not those dependencies have prebuilt hosting packages, you can start to model your application with Aspire and get the same benefit as the CRUD application: starting everything needed locally in a single step. Again, I pick <code>aspire run</code>:</p>
<pre><code class="language-plaintext">Dashboard:
📈  https://localhost:17244/login?t=e5f80d45a566a13a40755d0154e1d410

╭─────────────────────┬───────────────────────────┬─────────┬────────────────────────╮                
│ Resource            │ Type                      │ State   │ Endpoint(s)            │                
├─────────────────────┼───────────────────────────┼─────────┼────────────────────────┤                
│ api                 │ Project                   │ Running │ https://localhost:7034 │                
│                     │                           │         │ http://localhost:5170  │                
│ client              │ Executable                │ Running │ http://localhost:63526 │                
│ openai              │ AzureOpenAIResource       │ Running │ None                   │                
│ openai-roles        │ AzureProvisioningResource │ Running │ None                   │                
│ OpenAIModel         │ Parameter                 │ Unknown │ None                   │                
│ OpenAIName          │ Parameter                 │ Unknown │ None                   │                
│ OpenAIResourceGroup │ Parameter                 │ Unknown │ None                   │                
│ SpeechKey           │ Parameter                 │ Unknown │ None                   │                
│ SpeechRegion        │ Parameter                 │ Unknown │ None                   │                
╰─────────────────────┴───────────────────────────┴─────────┴────────────────────────╯                
Press CTRL-C to stop the app host and exit.
</code></pre>
<p><img src="/assets/_blog/local-friendly-aspire-modeling/ai_dashboard.png" alt="The dashboard of our AI application sample with resources for OpenAI, .NET web API, and npm web client."></p>
<p>Our dashboard still gives us a portal to interact with our local environment and the state of each resource. I get extra information for the Azure OpenAI service because I was able to use an existing hosting integration, even though the service itself is pre-provisioned. For the Speech service, I don't have to configure anything extra in other projects and can target the configuration of the AppHost project alone. My backend and frontend still benefit fully from the Aspire dashboard, and my application is up and running with a single command.</p>
<p>The Aspire parameters and Azure configuration use the <a href="https://victorfrye.com/blog/posts/dotnet-options-pattern">options pattern</a> to bind values from configuration sources. The parameters use the <code>Parameters</code> section of your configuration sources, such as app settings, user secrets, or environment variables, while the Azure configuration uses the <code>Azure</code> section. For more on these specifically, you can refer to the Aspire documentation on <a href="https://learn.microsoft.com/en-us/dotnet/aspire/fundamentals/external-parameters">external parameters</a> and <a href="https://learn.microsoft.com/en-us/dotnet/aspire/azure/local-provisioning">Azure local provisioning</a>.</p>
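<p>For illustration, the configuration backing the sample above might bind those two sections like this; every value here is a placeholder, and your keys will match the parameter names you declared in the app host:</p>
<pre><code class="language-json">{
  "Parameters": {
    "OpenAIModel": "gpt-4o-mini",
    "OpenAIName": "my-openai-resource",
    "OpenAIResourceGroup": "rg-my-ai-sample",
    "SpeechKey": "&#x3C;secret-api-key>",
    "SpeechRegion": "eastus2"
  },
  "Azure": {
    "SubscriptionId": "&#x3C;subscription-id>",
    "Location": "eastus2"
  }
}
</code></pre>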
<p>So now we have two different kinds of applications modeled in Aspire, one fully local and one with external dependencies. Both of these applications can be run with a single command, <code>aspire run</code>, and provide a dashboard to interact with them. Cool, right?</p>
<h2 id="why-this-matters"><a aria-hidden="true" tabindex="-1" href="#why-this-matters"><span class="icon icon-link"></span></a>Why this matters</h2>
<p>This modeling approach aligns perfectly with how we think about our applications, both locally and in deployment. The Aspire app host serves as a bill of materials for our application, defining all integrations and parameters needed to run.</p>
<p>The immediate benefit is a dramatically improved developer experience. New team members can clone the repository, run a single command (<code>aspire run</code>), and have a functional environment without learning setup procedures for each component. When you add new services to your stack, just update the model.</p>
<p>This pre-modeled environment also serves as an integration testing foundation. The <code>Aspire.Hosting.Testing</code> package lets you run your application host in test frameworks like xUnit or MSTest, enabling tests that validate your entire stack or specific components.</p>
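<p>As a sketch of what that can look like, the hypothetical xUnit test below starts the app host from the earlier samples and probes the <code>api</code> resource's health endpoint; the <code>Projects.AppHost</code> name is an assumption based on a typical project layout:</p>
<pre><code class="language-csharp">using Aspire.Hosting.Testing;
using Xunit;

public class ApiIntegrationTests
{
    [Fact]
    public async Task ApiHealthEndpointReturnsSuccess()
    {
        // Build and start the full application model defined in the app host
        var builder = await DistributedApplicationTestingBuilder
            .CreateAsync&#x3C;Projects.AppHost>();
        await using var app = await builder.BuildAsync();
        await app.StartAsync();

        // Resolve an HttpClient pointed at the "api" resource endpoint
        var client = app.CreateHttpClient("api");
        var response = await client.GetAsync("/alive");

        Assert.True(response.IsSuccessStatusCode);
    }
}
</code></pre>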
<p>Beyond the inner loop, Aspire also addresses deployment scenarios: publishers can generate infrastructure as code for Bicep, Terraform, Kubernetes, and more based on your application model.</p>
<h2 id="final-thoughts"><a aria-hidden="true" tabindex="-1" href="#final-thoughts"><span class="icon icon-link"></span></a>Final thoughts</h2>
<p>The service defaults were what drew me to Aspire initially, but my interest quickly evolved into excitement about the transformative developer experience it offers. The simplicity of modeling your application and running it locally with a single command changes expectations for the development inner loop.</p>
<p>All of my projects, including pure JavaScript apps like this blog, now run with Aspire because it's become my new standard. I encourage you to try it in your projects and experience how it reshapes your workflow. This feels like the future of .NET development, and I'll continue exploring Aspire's capabilities in future posts.</p>]]></content:encoded>
            <category>aspire</category>
            <category>cloudnative</category>
            <category>dotnet</category>
<enclosure url="https://victorfrye.com/assets/_blog/local-friendly-aspire-modeling/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Microsoft Build 2025 Wrapped]]></title>
            <link>https://victorfrye.com/blog/posts/microsoft-build-2025-wrapped</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/microsoft-build-2025-wrapped</guid>
            <pubDate>Thu, 22 May 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[A recap of the most exciting announcements from Microsoft Build 2025 according to your friendly neighborhood developer.]]></description>
            <content:encoded><![CDATA[<p>Microsoft Build 2025 has come and gone, and it has been a whirlwind of announcements, a buzzword storm of AI, and some hidden gems that you might have missed. Let's attempt to wrap up the highlights according to myself, your friendly neighborhood developer.</p>
<h2 id="ai-everywhere"><a aria-hidden="true" tabindex="-1" href="#ai-everywhere"><span class="icon icon-link"></span></a>AI Everywhere</h2>
<p>Microsoft Build 2025 was all about artificial intelligence, with nearly every session including the words "AI", "MCP", or "Copilot". Whether we like it or not, AI is the current trend and Microsoft is all in. The company is integrating AI into nearly every product, from Azure to VS Code, and from Windows to .NET Aspire! That's right, even my favorite new .NET framework is getting the AI treatment. Not only were the sessions all about AI, but the keynote and announcements also had a heavy focus on intelligence and Copilot additions up and down the technology stack. Some of these need their own blog posts, but here are some of the highlights:</p>
<ul>
<li><strong>GitHub Copilot Coding Agent</strong>: A new feature that allows developers to assign tasks to Copilot in the GitHub platform.</li>
<li><strong>GitHub Copilot for .NET Aspire</strong>: The Aspire dashboard now includes Copilot integration, allowing developers to chat with Copilot with Aspire context.</li>
<li><strong>NLWeb</strong>: A new protocol built on top of MCP that allows for natural language to be a first-class citizen in web development.</li>
<li><strong>Azure AI Foundry</strong>: New enhancements to Azure AI Foundry, including Foundry Local in preview—which brings models to your local machine—and Foundry Agent Service going to general availability.</li>
</ul>
<p>I want to focus on the first two items here as they are the two I am closest to and the most excited about.</p>
<h3 id="github-copilot-coding-agent"><a aria-hidden="true" tabindex="-1" href="#github-copilot-coding-agent"><span class="icon icon-link"></span></a>GitHub Copilot Coding Agent</h3>
<p><img src="/assets/_blog/microsoft-build-2025-wrapped/copilot_coding_agent.jpg" alt="GitHub Copilot Coding Agent hero">
<em>Image: Microsoft</em></p>
<p>GitHub Copilot continues to evolve with the introduction of the new Coding Agent, which allows developers to assign development tasks to Copilot in GitHub. From there, Copilot takes over the developer inner loop and handles writing code, running tests, and drafting a pull request to integrate the changes. You can watch the PR draft in real time to see the changes Copilot pushes and collaborate through comments to suggest changes. It even allows control over the MCP servers used in the development process. This sounds like the automated software engineer pitch, but instead it is being positioned for the tedious work developers often need to do, e.g. upgrading your .NET 8 project to .NET 10 this fall. It is in preview now but currently requires a GitHub Copilot Pro+ or Enterprise license, neither of which I have, as they are expensive at $390 a year. I might need to review my training budget...</p>
<h3 id="github-copilot-for-net-aspire"><a aria-hidden="true" tabindex="-1" href="#github-copilot-for-net-aspire"><span class="icon icon-link"></span></a>GitHub Copilot for .NET Aspire</h3>
<p><img src="/assets/_blog/microsoft-build-2025-wrapped/copilot_dotnet_aspire.jpg" alt="GitHub Copilot for .NET Aspire hero">
<em>Image: Microsoft</em></p>
<p>Yeah, that's right, <a href="https://victorfrye.com/blog/posts/hello-aspire-breaking-down-key-features">.NET Aspire</a> is getting the Copilot treatment too! When launching the app from Visual Studio or Visual Studio Code, you will now see the familiar GitHub Copilot icon in the top right corner of the Aspire dashboard. Using it to test my own website, I asked it to get traces and structured logs, and it was able to identify that no telemetry exists but my web client application was in the running state. That's correct, as I haven't configured any telemetry for the Next.js frontend just yet... I have been meaning to get to that.</p>
<p><img src="/assets/_blog/microsoft-build-2025-wrapped/aspire_dashboard_copilot.png" alt="A screenshot of the .NET Aspire dashboard with GitHub Copilot. Copilot shows a question asked for &#x22;get traces and structured logs&#x22; and responds with a summary of the findings, a root cause investigation to issues found, and suggested next steps."></p>
<p>Immediately, I am both impressed and annoyed. Will I use Copilot here? Probably, given that at work I can use GitHub Copilot professionally and we are utilizing .NET Aspire. However, it does not work when running from the dotnet CLI or the aspire CLI. For me, this is a big miss, even if it is a technical limitation, as I am a command-line enthusiast and my workflows start in the terminal. Secondly, the AI-ification of .NET Aspire means more people may be turned off by the product due to AI fatigue in the industry. But for those who are fatigued and want nothing to do with GitHub Copilot, you can thankfully disable it. Set the <code>ASPIRE_DASHBOARD_AI_DISABLED</code> environment variable to <code>true</code> in the app host <code>launchSettings.json</code> file to hide all Copilot UI elements.</p>
<pre><code class="language-json">{
  "$schema": "https://json.schemastore.org/launchsettings.json",
  "profiles": {
    "https": {
      "commandName": "Project",
      "dotnetRunMessages": true,
      "launchBrowser": true,
      "applicationUrl": "https://localhost:17168;http://localhost:15027",
      "environmentVariables": {
        // ... other environment variables
        "ASPIRE_DASHBOARD_AI_DISABLED": "true" // Disable GitHub Copilot in Aspire
      }
    }
  }
}
</code></pre>
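<p>The same variable can also be set in your shell session before launching the app host from a terminal; a minimal sketch, where the project path is illustrative:</p>
<pre><code class="language-bash"># Hide all Copilot UI elements in the Aspire dashboard for this session
export ASPIRE_DASHBOARD_AI_DISABLED=true

# Launch the app host as usual (project path is illustrative)
dotnet run --project ./src/AppHost
</code></pre>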
<h2 id="open-source-commitments"><a aria-hidden="true" tabindex="-1" href="#open-source-commitments"><span class="icon icon-link"></span></a>Open Source Commitments</h2>
<p><img src="/assets/_blog/microsoft-build-2025-wrapped/wsl_cli.png" alt="A screenshot of wsl CLI in Windows Terminal after the help argument was provided."></p>
<p>Microsoft continues to be a major contributor to open source and announced a couple of major projects moving from closed source to the open on GitHub. The first is a long time coming: the Windows Subsystem for Linux (WSL). I first used WSL to port a Java stack to Windows. That stack was a nightmare to run on Windows because the team had optimized for macOS workflows, but we wanted to enable new developers to use standard Windows dev machines and stop requiring expensive macOS hardware for a cross-platform toolchain like Java. Today, WSL is a major part of the Windows developer experience. And now, Microsoft is open-sourcing WSL to allow the community to contribute and innovate on the <a href="https://github.com/microsoft/wsl">project on GitHub</a>.</p>
<p>Another major project moving to open source is the GitHub Copilot Chat extension for Visual Studio Code. As more and more IDEs and text editors add AI features, the Copilot Chat extension is being open-sourced <strong>AND</strong> its code is being integrated into the Visual Studio Code core codebase. This means the main AI UI experience for Visual Studio Code will become a first-class component of VS Code. Personally, I am excited about this, as it pushes the AI developer experience toward transparency and competition with other juggernauts like Cursor and Windsurf. This is also another blurred line between the GitHub org and the developer division at Microsoft. It is a small step, but a step in the right direction.</p>
<h2 id="new-command-line-editor"><a aria-hidden="true" tabindex="-1" href="#new-command-line-editor"><span class="icon icon-link"></span></a>New Command Line Editor</h2>
<p><img src="/assets/_blog/microsoft-build-2025-wrapped/edit_about.png" alt="A screenshot of Edit running in Windows Terminal. The README for the project is opened along with the About dialog in the foreground showing the name &#x22;Microsoft Edit&#x22; and version &#x22;1.0.0&#x22; information."></p>
<p>A small footnote in the book of news and not mentioned in the keynote is my favorite announcement from Build: <strong>Edit</strong>. It is a command-line text editor, similar to Neovim or Emacs, that pays homage to the classic MS-DOS editor with modern inspiration from VS Code. Edit offers a modeless command-line interface, meaning you do not have to switch between command and edit modes like in Neovim. This makes it far easier to use for new developers or those unfamiliar with command-line workflows. I have already dropped it into my inner loop in favor of Neovim. There are a couple of kinks to work out, but I see a ton of potential and community momentum behind the app. It is already available to install via WinGet and launched with binaries for Windows and Linux. The source code is <a href="https://github.com/microsoft/edit">available on GitHub</a>, and issues and pull requests are already open for various features, including macOS support and additional localization.</p>
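<p>If you want to try it yourself, installation is a one-liner via WinGet, assuming the package identifier <code>Microsoft.Edit</code>:</p>
<pre><code class="language-bash"># Install Microsoft Edit from the WinGet community repository
winget install Microsoft.Edit
</code></pre>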
<h2 id="closing-thoughts"><a aria-hidden="true" tabindex="-1" href="#closing-thoughts"><span class="icon icon-link"></span></a>Closing Thoughts</h2>
<p>Microsoft Build 2025 was a whirlwind and a lot to digest. There was a ton of AI innovation, some open source announcements, and cool new tools like Edit released. Like every year, where any of these new technologies go is up to community adoption and may change over time. It is also hard to catch everything, so I recommend you check out the <a href="https://news.microsoft.com/build-2025-book-of-news/">Book of News</a> for a full list of announcements from Build. I look forward to seeing which of these technologies resonate with others. Until then, happy coding!</p>]]></content:encoded>
            <category>ai</category>
            <category>aspire</category>
            <category>azure</category>
            <category>cli</category>
            <category>dotnet</category>
            <category>events</category>
            <category>github</category>
            <category>windows</category>
<enclosure url="https://victorfrye.com/assets/_blog/microsoft-build-2025-wrapped/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Multi-Stage Docker Builds]]></title>
            <link>https://victorfrye.com/blog/posts/multi-stage-docker-dotnet-guide</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/multi-stage-docker-dotnet-guide</guid>
            <pubDate>Fri, 09 May 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[A practical guide to crafting a multi-stage build for production-ready .NET Docker images.]]></description>
            <content:encoded><![CDATA[<p>My Docker skills were getting rusty. My day-to-day work has shifted away from containerized workloads and more towards modernizing legacy systems or architecting serverless solutions. Somehow, I've also never drafted my own Dockerfile from scratch. Docker and containers are culturally synonymous, and both are core cloud native technologies that any modern developer should be familiar with. So, I decided to write my own multi-stage build for a .NET web API for fun. This post will explore the results and guide you based on my learnings.</p>
<p>For this post, I will be using .NET 10 in preview. The application is a simple web API with a single endpoint that returns "Hello, .NET!". You can find the code for this <a href="https://github.com/victorfrye/hellodotnet">here on GitHub</a>.</p>
<h2 id="what-is-a-multi-stage-build"><a aria-hidden="true" tabindex="-1" href="#what-is-a-multi-stage-build"><span class="icon icon-link"></span></a>What is a multi-stage build?</h2>
<p>A traditional Dockerfile might follow the same pattern as a pipeline: You install your tooling, check out your source code, and build your artifacts. It might look something like this:</p>
<pre><code class="language-dockerfile">FROM mcr.microsoft.com/dotnet/sdk:10.0-preview-alpine
WORKDIR /source

COPY src/WebApi/WebApi.csproj src/WebApi/
COPY test/Tests/Tests.csproj test/Tests/
COPY VictorFrye.HelloDotnet.slnx ./
RUN dotnet restore

COPY . .
RUN dotnet build -c Release --no-restore 

RUN dotnet test -c Release --no-build

RUN dotnet publish ./src/WebApi/WebApi.csproj -c Release --no-build -o /app

WORKDIR /app
USER $APP_UID
ENTRYPOINT ["dotnet", "VictorFrye.HelloDotnet.WebApi.dll"]
</code></pre>
<p>The problems with this approach are multifold. First, the image is large: over <strong>2 GB</strong>, as it contains all the build artifacts, including the .NET SDK and all the source code. That massive size means you take a performance hit, as it corresponds to longer build and deploy times and more expensive storage and network transfers. It is also a security risk, as it exposes your source code, which may contain intellectual property or sensitive information. We can optimize all of this by using a multi-stage build.</p>
<p>Notice our first Dockerfile includes exactly one <code>FROM</code> statement. This means we reused the same base image for our build and runtime. In a multi-stage build, we use multiple <code>FROM</code> statements to separate our stages. This results in distinct images for our build and for production runtime. The build image will utilize the full .NET SDK and all of the source code. The runtime image will only include the ASP.NET runtimes and the artifacts we need to run our application. This results in a smaller production image, faster to build and deploy and with a reduced attack surface.</p>
<h2 id="writing-the-multi-stage-dockerfile"><a aria-hidden="true" tabindex="-1" href="#writing-the-multi-stage-dockerfile"><span class="icon icon-link"></span></a>Writing the multi-stage Dockerfile</h2>
<p>The first thing we need to do is decide on our base images. For production, I know this is an ASP.NET web API and want to keep it slim. We do not want the SDK included and only need the ASP.NET runtime and its dependencies. As for our Linux flavor, Alpine is my go-to choice, as it's stripped down to the essentials and security minded. Keep in mind that Alpine is not always the best choice for every application. Thus, I will be using the <code>mcr.microsoft.com/dotnet/aspnet:10.0-preview-alpine</code> image for our production <strong>base</strong> image. For our <strong>build</strong> image, we want to align architecture to production but need the full .NET SDK. This means we will use the <code>mcr.microsoft.com/dotnet/sdk:10.0-preview-alpine</code> image. For deciding on yours, I recommend browsing the <a href="https://mcr.microsoft.com/">Microsoft Artifact Registry</a>.</p>
<pre><code class="language-dockerfile">FROM mcr.microsoft.com/dotnet/aspnet:10.0-preview-alpine AS base
LABEL com.github.owner="victorfrye"
LABEL com.github.repo="hellodotnet"
USER $APP_UID

FROM mcr.microsoft.com/dotnet/sdk:10.0-preview-alpine AS build
WORKDIR /source
</code></pre>
<p>Notice the <code>AS</code> keyword and the names we have assigned. This is how we can reference our stages later. In the production <strong>base</strong> stage, we can set additional production configurations, such as the <code>USER</code>, or add labels. Now that we have our base images, we can start to compile our application.</p>
<p>For our <strong>build</strong>, we first need to install our dependencies and run a <code>dotnet restore</code>. To do this, we need to copy our solution and project files into the build image. Remember, Docker uses layers and caching to optimize image builds, so small, discrete steps create cache efficiencies.</p>
<pre><code class="language-dockerfile">COPY src/WebApi/WebApi.csproj src/WebApi/
COPY test/Tests/Tests.csproj test/Tests/
COPY VictorFrye.HelloDotnet.slnx ./
RUN dotnet restore
</code></pre>
<p>Next, we need to copy the rest of our source code into the <strong>build</strong> image and build our binaries for release. We also want to explicitly ensure we are not repeating the previous steps.</p>
<pre><code class="language-dockerfile">COPY . .
RUN dotnet build -c Release --no-restore 
</code></pre>
<p>At this point we are wrapping up our initial build stage. Our builder still needs to run our tests and publish, but like a pipeline we can separate these into their own stages. This is good practice as it allows Docker to fail fast and creates a logical separation of concerns. Our next stage will be our <strong>test</strong> stage and use the build stage as its base. It will execute our tests without rebuilding and fail the image build if they do not pass.</p>
<pre><code class="language-dockerfile">FROM build AS test
RUN dotnet test -c Release --no-build
</code></pre>
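<p>A nice side effect of naming stages is that the image build can be stopped at any one of them with Docker's <code>--target</code> flag. For example, CI could run only through the <strong>test</strong> stage without producing a runtime image; the tag name here is illustrative:</p>
<pre><code class="language-bash"># Build only through the test stage; the build fails if any test fails
docker build --target test -t hellodotnet-tests .
</code></pre>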
<p>Now we can move on to our final builder stage: <strong>publish</strong>. This stage will build on the <strong>test</strong> stage, including the previous build and test steps. We need to reference the test stage to ensure the full chain of events is executed. The goal of publishing is to output the compiled binaries and dependencies to a directory; they are the artifacts our application needs to run in production. We want to be explicit about our output directory, as we will use it in our final image.</p>
<pre><code class="language-dockerfile">FROM test AS publish
RUN dotnet publish ./src/WebApi/WebApi.csproj -c Release --no-build -o /out
</code></pre>
<p>We are almost done! We have our entire builder and, within it, the artifacts we want. Our builder now ensures a repeatable process and environment for portable consistency. The last step is to copy our artifacts to our production base image as our <strong>final</strong> stage. We will use the <strong>base</strong> stage we defined at the beginning and copy the artifacts from the <strong>publish</strong> stage, then set our application entry point to the compiled DLL for our application.</p>
<pre><code class="language-dockerfile">FROM base AS final
WORKDIR /app
COPY --from=publish /out .
ENTRYPOINT ["dotnet", "VictorFrye.HelloDotnet.WebApi.dll"]
</code></pre>
<h2 id="the-final-result"><a aria-hidden="true" tabindex="-1" href="#the-final-result"><span class="icon icon-link"></span></a>The final result</h2>
<p>Putting all our stages together, we still have a single Dockerfile. My result looks like this:</p>
<pre><code class="language-dockerfile">FROM mcr.microsoft.com/dotnet/aspnet:10.0-preview-alpine AS base
LABEL com.github.owner="victorfrye"
LABEL com.github.repo="hellodotnet"
USER $APP_UID

FROM mcr.microsoft.com/dotnet/sdk:10.0-preview-alpine AS build
WORKDIR /source

COPY src/WebApi/WebApi.csproj src/WebApi/
COPY test/Tests/Tests.csproj test/Tests/
COPY VictorFrye.HelloDotnet.slnx ./
RUN dotnet restore

COPY . .
RUN dotnet build -c Release --no-restore 

FROM build AS test
RUN dotnet test -c Release --no-build

FROM test AS publish
RUN dotnet publish ./src/WebApi/WebApi.csproj -c Release --no-build -o /out

FROM base AS final
WORKDIR /app
COPY --from=publish /out .
ENTRYPOINT ["dotnet", "VictorFrye.HelloDotnet.WebApi.dll"]
</code></pre>
<p>This includes multiple stages but produces a smaller production image of <strong>167.5 MB</strong>, down from our initial <strong>2 GB</strong>! That is over a 90% reduction in size, and the image includes none of the source code or build artifacts. It also benefits from faster build times, as changes in one stage only require rebuilding the subsequent stages. My favorite part is how similar the structure is to a non-containerized pipeline or GitHub Actions workflow: you move from setup to build to test to publish to production-ready artifacts.</p>
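<p>You can verify the size on your own machine by building and listing the image; the tag name is illustrative:</p>
<pre><code class="language-bash"># Build the final image and inspect its reported size
docker build -t hellodotnet .
docker images hellodotnet
</code></pre>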
<p>Writing this Dockerfile was a fun exercise and preparation for me in working heavily with Docker again. Hopefully, it also helps you understand the mental process, structure, and benefits of a multi-stage build.</p>]]></content:encoded>
            <category>aspnet</category>
            <category>cloudnative</category>
            <category>devops</category>
            <category>docker</category>
            <category>dotnet</category>
<enclosure url="https://victorfrye.com/assets/_blog/multi-stage-docker-dotnet-guide/banner.jpg" length="0" type="image/jpeg"/>
        </item>
        <item>
            <title><![CDATA[Hello .NET Aspire]]></title>
            <link>https://victorfrye.com/blog/posts/hello-aspire-breaking-down-key-features</link>
            <guid isPermaLink="false">https://victorfrye.com/blog/posts/hello-aspire-breaking-down-key-features</guid>
            <pubDate>Sat, 03 May 2025 05:00:00 GMT</pubDate>
            <description><![CDATA[.NET Aspire is a new framework for building cloud-native and distributed applications. Let's break down the key features.]]></description>
            <content:encoded><![CDATA[<p>.NET Aspire is the latest framework from Microsoft in the .NET ecosystem, adding to ASP.NET, Blazor, Entity Framework, MAUI, etc. Released in 2023, it was designed specifically for cloud-native and distributed applications and acts as an orchestrator for the entire application stack. It is opinionated, meaning it provides a set of conventions and best practices for how to build applications. If you adopt these opinions, Aspire makes the developer experience much smoother and more productive. Some of these key features include:</p>
<ul>
<li><strong>Application modeling</strong>: Aspire allows you to model your application in C# code instead of using YAML or other configuration languages.</li>
<li><strong>Local development</strong>: It provides a seamless local development experience, allowing you to start and stop your entire application with a single command.</li>
<li><strong>Opinionated service defaults</strong>: It provides a set of default configurations and settings for common cloud-native concerns, which can be extended or overridden as needed.</li>
<li><strong>Client integrations</strong>: Aspire includes client libraries and SDKs for common cloud-native services, making it easier to integrate your application with dependencies locally and in the cloud.</li>
<li><strong>Automated testing</strong>: Aspire can be used with existing test frameworks to spin up and tear down your entire application stack for integration or functional tests.</li>
</ul>
<p>In this post, I will provide some background on Aspire and an overview of its key features.</p>
<h2 id="brief-history-so-far"><a aria-hidden="true" tabindex="-1" href="#brief-history-so-far"><span class="icon icon-link"></span></a>Brief history so far</h2>
<p>At the initial launch of Aspire, the communication about the purpose of this new framework was murky. It was described as "an opinionated, cloud ready stack for building observable, production ready, distributed applications." This was a bit vague and left many developers, including myself, wondering what exactly it offered. However, as we started to play with it, we quickly realized it was an exciting new way to manage the complexity of building modern applications.</p>
<p>Since then, the Aspire team has been quickly evolving the framework and adding new features based on community reception. This has led to features like the <code>Aspire.Hosting.Testing</code> package for automated testing and the upcoming Aspire CLI.</p>
<h2 id="support-and-updates"><a aria-hidden="true" tabindex="-1" href="#support-and-updates"><span class="icon icon-link"></span></a>Support and updates</h2>
<p>Aspire is unique in that most of its features are not used in production; instead, they are used during local development and help prepare for production. As a new framework, Aspire is also evolving rapidly based on community reception: the team at Microsoft ships updates multiple times a year, and <em>only the latest version of Aspire is currently supported</em>. This is a culture shock for many .NET developers familiar with the yearly LTS/STS releases of .NET, but as we explore what Aspire is and how it works, you will see this is a good thing and introduces minimal risk to your production applications.</p>
<h2 id="key-features-of-aspire"><a aria-hidden="true" tabindex="-1" href="#key-features-of-aspire"><span class="icon icon-link"></span></a>Key features of Aspire</h2>
<h3 id="application-modeling"><a aria-hidden="true" tabindex="-1" href="#application-modeling"><span class="icon icon-link"></span></a>Application modeling</h3>
<p>Aspire provides a set of abstractions and patterns for modeling your application in C# code. This significantly differs from alternatives like Docker Compose where YAML is used to define your model. The C# code-first approach feels more natural for .NET developers and allows for a similar experience to how you write other application configuration features for startup or dependency injection, but now modeling your external and distributed dependencies.</p>
<p>Using a new <code>AppHost</code> project in your solution, you can model the makeup of your distributed application. Below is a sample <code>Program.cs</code> that includes an ASP.NET Web API, SQL Server database, and Redis cache.</p>
<pre><code class="language-csharp">var builder = DistributedApplication.CreateBuilder(args);

var sql = builder.AddSqlServer("sql")
                 .AddDatabase("db");

var cache = builder.AddRedis("cache");

builder.AddProject&#x3C;Projects.WebApi>("api")
       .WithReference(sql)
       .WaitFor(sql)
       .WithReference(cache)
       .WaitFor(cache)
       .WithExternalHttpEndpoints();

await builder.Build().RunAsync();
</code></pre>
<p>The app host project also includes a <code>.csproj</code> file that references the Aspire SDK and the hosting integrations unique to Aspire. A sample <code>.csproj</code> file for the above app host project might look like this:</p>
<pre><code class="language-xml">&#x3C;Project Sdk="Microsoft.NET.Sdk">

  &#x3C;Sdk Name="Aspire.AppHost.Sdk" Version="9.2.0" />

  &#x3C;PropertyGroup>
    &#x3C;OutputType>Exe&#x3C;/OutputType>
    &#x3C;TargetFramework>net9.0&#x3C;/TargetFramework>
    &#x3C;ImplicitUsings>enable&#x3C;/ImplicitUsings>
    &#x3C;Nullable>enable&#x3C;/Nullable>
    &#x3C;UserSecretsId>{{SomeGuid}}&#x3C;/UserSecretsId>
  &#x3C;/PropertyGroup>

  &#x3C;ItemGroup>
    &#x3C;ProjectReference Include="..\WebApi.csproj" />
  &#x3C;/ItemGroup>

  &#x3C;ItemGroup>
    &#x3C;PackageReference Include="Aspire.Hosting.AppHost" Version="9.2.0" />
    &#x3C;PackageReference Include="Aspire.Hosting.Redis" Version="9.2.0" />
    &#x3C;PackageReference Include="Aspire.Hosting.SqlServer" Version="9.2.0" />
  &#x3C;/ItemGroup>

&#x3C;/Project>
</code></pre>
<p>These two key files make up the heart of Aspire's core feature: the app host.</p>
<h3 id="local-development"><a aria-hidden="true" tabindex="-1" href="#local-development"><span class="icon icon-link"></span></a>Local development</h3>
<p>Aspire provides a local development experience that is similar to how you would run your application in the cloud. This includes support for containerized services and executable workloads traditionally outside the .NET ecosystem. For example, you can run a PostgreSQL database, a Java backend, and a JavaScript frontend by starting the Aspire app host project, with or without any other C# code in your solution. You get a dashboard for visualizing your application stack and a unified way to start and stop the entire distributed system.</p>
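<p>As a hedged sketch of such a polyglot model, the app host below mixes a container database, a prebuilt backend image, and an npm-based frontend. It assumes the <code>Aspire.Hosting.PostgreSQL</code> and <code>Aspire.Hosting.NodeJs</code> hosting packages are referenced, and the resource names, image name, and paths are illustrative:</p>
<pre><code class="language-csharp">var builder = DistributedApplication.CreateBuilder(args);

// Containerized PostgreSQL server with a logical database
var postgres = builder.AddPostgres("postgres")
                      .AddDatabase("appdb");

// Java backend packaged as a container image (image name is illustrative)
var backend = builder.AddContainer("backend", "myorg/java-backend")
                     .WithHttpEndpoint(targetPort: 8080);

// JavaScript frontend started via npm (requires Aspire.Hosting.NodeJs)
builder.AddNpmApp("frontend", "../frontend")
       .WithReference(postgres)
       .WithHttpEndpoint(env: "PORT");

await builder.Build().RunAsync();
</code></pre>
<p>None of the three workloads above needs to be a .NET project; the app host orchestrates them all the same way.</p>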
<p>With our app host project, we have a new entry point for our local application that can start up the entire application stack. With the press of F5 in Visual Studio, or by running <code>dotnet run</code> against the app host project, all resources defined in the application model start up and become available. For our sample app, this includes the ASP.NET Web API, a SQL Server database, and Redis for caching. Starting the app host also launches the Aspire dashboard.</p>
<p>Our dashboard will look something like this:</p>
<p><img src="/assets/_blog/hello-aspire-breaking-down-key-features/sample_dashboard.png" alt="A screenshot of the .NET Aspire sample app dashboard. The resource table is displayed along with the cache, database, and api resource entries."></p>
<p>This dashboard provides a visual representation of our application stack resources, console logs, and telemetry data for each of the services. It becomes the central UI hub for exploring our local app.</p>
<h3 id="opinionated-service-defaults"><a aria-hidden="true" tabindex="-1" href="#opinionated-service-defaults"><span class="icon icon-link"></span></a>Opinionated service defaults</h3>
<p>Aspire provides a set of default configurations and settings for common cloud-native concerns including instrumentation, monitoring, and service discovery. These defaults are defined once in a new project and then applied to other projects in the solution. You can extend, override, or opt-out of these defaults as needed, but the goal is to provide a set of best practices that you can follow to get started quickly.</p>
<p>The service defaults are a second new project type in your solution. This shared project, commonly named <code>ServiceDefaults</code>, includes a <code>.csproj</code> file and a single <code>Extensions.cs</code> file. The project is then referenced by your application projects. The <code>Extensions.cs</code> file includes default configurations for:</p>
<ul>
<li>
<p><strong>OpenTelemetry</strong>: Adds OpenTelemetry SDK services including logger, metrics, and tracing. Additionally, it configures the OpenTelemetry Protocol (OTLP) exporter for sending data to a collector.</p>
</li>
<li>
<p><strong>Health Checks</strong>: Adds two default health check endpoints: <code>/health</code> and <code>/alive</code>. The former includes predefined checks from hosting integrations, while the latter simply responds if the application is running.</p>
</li>
<li>
<p><strong>Service Discovery</strong>: Enables service discovery by default, registering its services for dependency injection and wiring it into the <code>HttpClient</code> builder.</p>
</li>
<li>
<p><strong>HTTP Client</strong>: Adds resiliency defaults using Polly for all <code>HttpClient</code> instances.</p>
</li>
</ul>
<p>These defaults are exported as two extension methods <code>AddServiceDefaults</code> and <code>MapDefaultEndpoints</code> that can be called while building your .NET application as shown below:</p>
<pre><code class="language-csharp">var builder = WebApplication.CreateBuilder(args);
builder.AddServiceDefaults();
// ... existing service configuration

var app = builder.Build();
app.MapDefaultEndpoints();
// ... existing app configuration

await app.RunAsync();
</code></pre>
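<p>For context, below is a simplified sketch of what such an <code>AddServiceDefaults</code> extension might contain. The real generated <code>Extensions.cs</code> is longer, and the exact calls depend on your Aspire version, so treat this as illustrative rather than the actual template:</p>
<pre><code class="language-csharp">public static class Extensions
{
    public static IHostApplicationBuilder AddServiceDefaults(this IHostApplicationBuilder builder)
    {
        // OpenTelemetry logging, metrics, and tracing (an OTLP exporter is
        // configured separately when an endpoint is present)
        builder.Logging.AddOpenTelemetry(logging => logging.IncludeFormattedMessage = true);
        builder.Services.AddOpenTelemetry()
            .WithMetrics(metrics => metrics.AddAspNetCoreInstrumentation())
            .WithTracing(tracing => tracing.AddAspNetCoreInstrumentation());

        // A basic liveness check backing the /alive endpoint
        builder.Services.AddHealthChecks()
            .AddCheck("self", () => HealthCheckResult.Healthy(), tags: ["live"]);

        // Service discovery plus resilient HttpClient defaults
        builder.Services.AddServiceDiscovery();
        builder.Services.ConfigureHttpClientDefaults(http =>
        {
            http.AddStandardResilienceHandler();
            http.AddServiceDiscovery();
        });

        return builder;
    }
}
</code></pre>
<p>Because this lives in your own solution rather than a sealed package, you can extend or override any of these defaults directly.</p>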
<h3 id="client-integrations"><a aria-hidden="true" tabindex="-1" href="#client-integrations"><span class="icon icon-link"></span></a>Client integrations</h3>
<p>Aspire provides a set of client libraries and SDKs for common cloud-native services, such as SQL Server, Redis, Azure Service Bus, and OpenAI. These libraries are designed to seamlessly integrate your application with dependent services locally and post-deployment. This allows you to focus on writing your application code without worrying about the underlying infrastructure.</p>
<p>Some of the client integrations include:</p>
<ul>
<li>SQL Server</li>
<li>Redis</li>
<li>Azure Service Bus</li>
<li>Azure Blob Storage</li>
<li>Azure OpenAI</li>
<li>Ollama</li>
</ul>
<p>These client integrations are added to your application project via NuGet packages. For example, to add the Entity Framework SQL Server integration you might run the following command:</p>
<pre><code class="language-bash">dotnet add package Aspire.Microsoft.EntityFrameworkCore.SqlServer
</code></pre>
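<p>After installing the package, registration in the consuming project is typically a one-liner that binds the client to a connection name from the app host model. A hedged sketch, assuming a hypothetical <code>AppDbContext</code> and the <code>"db"</code> resource from the earlier sample:</p>
<pre><code class="language-csharp">var builder = WebApplication.CreateBuilder(args);

// "db" matches the database name declared in the app host model; the
// integration wires up the connection string, health checks, and telemetry.
builder.AddSqlServerDbContext&#x3C;AppDbContext>("db");

var app = builder.Build();
await app.RunAsync();
</code></pre>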
<h3 id="automated-testing"><a aria-hidden="true" tabindex="-1" href="#automated-testing"><span class="icon icon-link"></span></a>Automated testing</h3>
<p>As an orchestrator for your local environment, Aspire can spin up and tear down your entire application stack for automated testing. Using the <code>Aspire.Hosting.Testing</code> package and your existing test framework, you can write integration or functional tests in C# that run against your full application in the same environment you develop in. This lets you test your application in more realistic scenarios and catch issues earlier in the local development loop, instead of waiting for a deployment to a staging or production environment.</p>
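<p>A hedged, xUnit-style sketch of such a test is shown below. It assumes an app host project named <code>AppHost</code> and the <code>"api"</code> resource from the earlier sample; the endpoint path comes from the default health checks:</p>
<pre><code class="language-csharp">[Fact]
public async Task GetHealthEndpointReturnsOk()
{
    // Build and start the full application model from the app host project
    var appHost = await DistributedApplicationTestingBuilder
        .CreateAsync&#x3C;Projects.AppHost>();
    await using var app = await appHost.BuildAsync();
    await app.StartAsync();

    // Resolve an HttpClient against the "api" resource's endpoint
    var httpClient = app.CreateHttpClient("api");
    var response = await httpClient.GetAsync("/health");

    Assert.Equal(HttpStatusCode.OK, response.StatusCode);
}
</code></pre>
<p>The same pattern works for asserting against the database or cache resources, since the whole model is running.</p>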
<h2 id="conclusion"><a aria-hidden="true" tabindex="-1" href="#conclusion"><span class="icon icon-link"></span></a>Conclusion</h2>
<p>Aspire is an exciting new addition to the .NET ecosystem. It is actively evolving and has a lot of potential to simplify the developer experience, both locally and on the path to production. If you have not already, I highly encourage you to try adding it to your existing projects or to start your next project with it. If you are interested in learning more, follow along as I explore the framework in more detail in future posts. I will be diving into specific features such as the application host, extending service defaults, and integration testing. Or, if you cannot wait, check out <a href="https://learn.microsoft.com/en-us/dotnet/aspire/">Aspire on Microsoft Learn</a> for the official documentation.</p>]]></content:encoded>
            <category>aspire</category>
            <category>cloudnative</category>
            <category>dotnet</category>
            <category>testing</category>
            <enclosure url="https://victorfrye.com/assets/_blog/hello-aspire-breaking-down-key-features/banner.jpg" length="0" type="image/jpeg"/>
        </item>
    </channel>
</rss>