I stumbled across the Cloud Resume Challenge by accident. Somebody had posted on LinkedIn about the things they learned by doing this challenge. They were talking about some very cool things, so I decided to check it out. I really liked what I saw, and thus I embarked on this journey.

What is the Cloud Resume Challenge?

This challenge is really just a self-guided curriculum. By performing each of its steps you check off a list of useful skills related to creating, deploying, and maintaining cloud-based tools. I am already quite proficient on the hardware side of things; I have deployed physical servers for my test lab at Xbox. But that is only one small part of the cloud. The cloud fascinates me, and it is only going to become more important as more and more of the global population gains access to the internet. The absolute best part of the whole challenge? What you have to show for it when you are done. Once you complete the challenge you have a practical, public-facing portfolio piece: a showcase of everything you have learned along the way that anybody in the world can see. Ars Technica also released an article about how they host their website, and after learning more about the Cloud Resume Challenge, a lot of the concepts they outlined there sounded very familiar. It was proof that the skills imparted through the challenge were valid in the industry. Here is the challenge:

  • 1 - Certification
  • Your resume needs to have a Cloud Platform certification on it, such as the AZ-900 certification, the AWS Cloud Practitioner certification, or the Google Cloud Digital Leader certification.
  • 2 - HTML
  • Your resume needs to be written in HTML. Not a Word doc, not a PDF.
  • 3 - CSS
  • Your resume needs to be styled with CSS. No worries if you're not a designer - neither am I. It doesn't have to be fancy. But we need to see something other than raw HTML when we open the webpage.
  • 4 - Static Website
  • Your HTML resume should be deployed online as an Azure Storage static website. Services like Netlify and GitHub Pages are great and I would normally recommend them for personal static site deployments, but they make things a little too abstract for our purposes here. Use Azure Storage.
  • 5 - HTTPS
  • The Azure Storage website URL should use HTTPS for security. You will need to use Azure CDN to help with this.
  • 6 - DNS
  • Point a custom DNS domain name to the Azure CDN endpoint, so your resume can be accessed at something like my-c00l-resume-website.com. You can use Azure DNS or any other DNS provider for this. A domain name usually costs about ten bucks to register.
  • 7 - JavaScript
  • Your resume webpage should include a visitor counter that displays how many people have accessed the site. You will need to write a bit of JavaScript to make this happen. Here is a helpful tutorial to get you started in the right direction.
  • 8 - Database
  • The visitor counter will need to retrieve and update its count in a database somewhere. I suggest you use the Table API of Azure's CosmosDB for this. (Use serverless capacity mode for the database and you'll pay essentially nothing, unless you store or retrieve much more data than this project requires.)
  • 9 - API
  • Do not communicate directly with CosmosDB from your Javascript code. Instead, you will need to create an API that accepts requests from your web app and communicates with the database. I suggest using Azure Functions with an HTTP trigger for this. They will be free or close to free for what we are doing.
  • 10 - Python
  • You will need to write a bit of code in the Azure Function; you could use more Javascript, but it would be better for our purposes to explore Python - a common language used in back-end programs and scripts - and its Azure SDK. Here is a good, free Python tutorial.
  • 11 - Tests
  • You should also include some tests for your Python code.
  • 12 - Infrastructure as Code
  • You should not be configuring your API resources - the Azure Function, the CosmosDB - manually, by clicking around in the Azure console. Instead, define them in an Azure Resource Manager (ARM) template on a Consumption plan. This is called “infrastructure as code” or IaC. It saves you time in the long run.
  • 13 - Source Control
  • You do not want to be updating either your back-end API or your front-end website by making calls from your laptop, though. You want them to update automatically whenever you make a change to the code. (This is called continuous integration and deployment, or CI/CD.) Create a GitHub repository for your backend code.
  • 14 - CI/CD (Back end)
  • Set up GitHub Actions such that when you push an update to your ARM template or Python code, your Python tests get run. If the tests pass, the ARM application should get packaged and deployed to Azure.
  • 15 - CI/CD (Front end)
  • Create a second GitHub repository for your website code. Create GitHub Actions such that when you push new website code, the Azure Storage blob automatically gets updated. (You may need to purge your Azure CDN endpoint in the code as well.) Important note: DO NOT commit Azure credentials to source control! Bad hats will find them and use them against you!
  • 16 - Blog post
  • Finally, in the text of your resume, you should link a short blog post describing some things you learned while working on this project. Dev.to or Hashnode are great places to publish if you don't have your own blog.
Sounds neat, right? Or so I thought anyway. But what's with all of the Azure junk?

There are three different versions of the Cloud Resume Challenge: one for Azure, one for Google Cloud, and one for AWS. The author tailors each challenge to the specific tools of that platform, so you can become an expert in whichever one you find the most exciting. Except... I am hipster trash and I don't want to use any cloud service where I will have to pay a FAANG company any of my hard-earned money! Money that I earned by working for Microsoft. So, I decided that I would not be following any of the specific challenges. I would instead make my own. I will be deploying my resume to the Dfinity Internet Computer! My challenge resume will likely be the only one hosted as a dApp, with the hosting costs paid exclusively by the rewards earned from the governance tokens I have staked into the IC Network Nervous System. Does this sort of defeat the purpose of the challenge a little? Perhaps. But perhaps it doesn't matter, because this is what piqued my curiosity the most and what would motivate me to learn more than just following instructions for some other, more boring platform. If I can learn one platform, I can learn another.

Step 1 - Certification

I skipped this one. I will likely just do some of the CompTIA certifications; in particular, I think I will focus on Server+. That is where my interest really lies: in the hardware. I should probably even get my A+ cert at some point, though I have been professionally repairing computers for Xbox for over 7 years by now. Updates will appear here when I decide to go through with this.

Step 2 and 3 - HTML/CSS

I wanted more than just a resume. I wanted a website. I looked through many templates, but in the end there was only one conclusion: build my own from scratch. Not only was that the intent of the challenge, it was also the best way for me to learn how to use it and change it. To build is to know. It would force me to come up with my own way to display my accomplishments, and it would be an accomplishment in itself. I came up with a plan: a homepage that could host both text posts and image galleries. This could showcase my resume, coding projects, writing projects, and physical projects all in one. To do this it would need to be modular, with formats and templates for pretty much any kind of content I wanted to host.

I did already have some experience in this field. First in a middle school classroom. Then in my first job post-college, where I used HTML/CSS and JavaScript to create webforms to gather testing data from employees testing X-ray receptors fresh off the factory line. Later I used HTML/CSS and PHP to create advertisements and emails for a company that wanted to blast special bulk discount deals to clients. That was years ago, however, and I needed to knock some of the rust off. I began with the W3Schools tutorials, then some Codecademy courses, starting from the beginning to re-establish my fundamentals. I put together the basics of my website using this alone.

Then it came time for fine-tuning and troubleshooting. I had become proficient enough to execute my plan, but making it work correctly was taking so long. There is nothing quite like a Large Language Model to get you out of a jam. It was like having a college professor at my side at all times. I could ask 'What are all the possible parameters of this function?' and receive a list of every input along with examples of how to use them. I could ask 'An element on my webpage is not behaving as expected in these ways. What could cause this?' and get three possibilities to go and investigate. Sometimes all of the suggestions were wrong, but they still gave me a starting point for eliminating possibilities. Other times one of them was correct and saved me the time of tracking it down unassisted. These AI tools did not create the website for me; that would defeat the purpose of the challenge anyway. What they did do was shorten the time I needed to pull it off. AI tools are just a rubber ducky that can talk back. I have dabbled in rubber-ducky debugging in the past, but a rubber ducky could never offer you a novel solution or point out that you missed a semicolon. A rubber ducky with the power of every written word on the internet behind it is a completely different beast.

With the support of my LLM rubber ducky I came up with a modular and modern HTML/CSS static website, one where I spent waaaaay too much time making it responsive to any screen size and to light and dark mode preferences. Go ahead and change your color preference on your device; you'll see that the whole website changes!
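The light/dark trick, at least, is less magic than it sounds. Here is a minimal CSS sketch of the pattern; the class names and colors are placeholders of mine, not the site's actual stylesheet:

```css
/* Theme colors as custom properties; default to the light palette. */
:root {
  --bg: #ffffff;
  --fg: #1a1a1a;
}

/* Swap the palette automatically when the visitor prefers dark mode. */
@media (prefers-color-scheme: dark) {
  :root {
    --bg: #1a1a1a;
    --fg: #f0f0f0;
  }
}

body {
  background: var(--bg);
  color: var(--fg);
}

/* A two-column layout that collapses on narrow screens. */
.content {
  display: grid;
  grid-template-columns: 1fr 1fr;
}

@media (max-width: 600px) {
  .content {
    grid-template-columns: 1fr;
  }
}
```

Because everything styled references the custom properties, one media query flips the entire site at once, and no JavaScript is needed at all.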
I began with the idea that this website would use PHP to be more dynamic, and the modularity mostly stemmed from that. But for now a static website is the goal of the challenge, and a much cheaper option anyway.

Step 4 - Static Website

So here I was with the makings of my very own website. Of my own design and hand-made from scratch. Hosted on my very own NAS, accessible from anywhere... in my house. I needed to get it hosted online, and now it was time to learn more about the Internet Computer.

I have been following the IC ever since its launch during the crypto fervor of 2021. It would be easy to mistake the Internet Computer and its governance token ICP for another crypto fad like NFT art. But when I learned about the Internet Computer, I was smitten. Before we get to the IC, though, let's go through a bit of crypto history. First, crypto was just a currency. Bitcoin was and is just a way to transfer value from one place to another without any kind of intermediary. One party could transfer any amount of BTC to any other party in the world, the BTC blockchain would verify the transaction, and anyone in the world could verify that the transaction had taken place and was valid. Then came Ethereum and its capability to perform smart contracts. Complex programs and agreements could be executed securely and autonomously anywhere in the world with the same technology that powered Bitcoin. People could buy spare HDD storage space from a stranger anonymously renting it out, with the provider rewarded in a token called STORJ. People could rent GPUs for rendering large video projects from anyone in the world with a spare GPU and pay for it with a token called RNDR. People could install a weather station in their backyard and be rewarded with a token called WXM, then sell that WXM to anyone who wanted access to a worldwide database of realtime weather data. All of these things were cryptographically secured, transactions were publicly verifiable via a public ledger, and they all operated autonomously. Once the hype died down, only the useful tools remained; for the most part anyway, and I was hoping the Internet Computer would be one of them. On the surface, cryptocurrency just seems like magic internet money, but in reality it is an autonomous platform for buying and selling services. Services that FAANG would love to have you believe can only be done by them, and only by completely giving up the right to keep your information private. And definitely only if you pay a monthly subscription.

The Internet Computer is a decentralized blockchain platform developed by the DFINITY Foundation. It aims to provide a secure and scalable environment for building and deploying a new generation of dApps (decentralized applications) and services on the internet. The Internet Computer seeks to revolutionize the way applications are hosted, run, and interacted with on the web. At least, that is how Dfinity describes it. I think they are right.

I like data centers. They are large, high-tech constructs packed to the brim with cutting-edge technology. They house so much raw compute power that they can accomplish just about anything. They can be used to store all of the data that the US Government has dredged out of its citizens. They can play a game for you and transmit it to your house. They can render movies and stream them directly to your home TV. They can calculate a procedurally generated galaxy and let players fly pretend spaceships around in it. They can power an AI that has read every word ever written by human hands and remembers all of it. A single data center can do all of these things at once. Maybe someday one can even simulate a human brain. I love that. I love them. I want one. But you can't just buy one; it would cost you nearly a trillion USD to build one.

That's where the Internet Computer comes in. What they are actually doing is providing the same services as a data center, except they don't even have one. They let just about anyone provide a tiny amount of data center power and then combine it all into one monstrous virtual data center. A competitor to AWS and Azure, but distributed and autonomous. People from all over the world provide the server nodes that this data center runs on. It's not just one data center; it is all of them. Within a traditional data center, the virtual machines inside scale up and down to match clients' needs. The Internet Computer is an entire data center that can automatically scale up and down to meet its clients' needs. That was a really long way to say that the Internet Computer is a public compute platform that can host a website on the public internet in much the same way as hosting 'serverless' content on Azure or AWS.

But Azure and AWS demand monthly subscriptions, and I despise subscriptions. I planned to host my website on the IC forever with a single up-front payment. The reason I could do this is the Network Nervous System (NNS) that controls the IC. The NNS controls the IC by accepting and implementing proposals for changes to it: software updates, adding or removing nodes, disbursing payments to node providers, et cetera. All stakeholders vote whether to accept or reject the proposals, and the voters are rewarded with ICP tokens. A stakeholder is anyone who has 'staked' ICP into a voting neuron. So all I had to do was pay an upfront fee for a small cache of ICP. I staked all of it into the NNS and told that staked 'neuron' to follow the votes of the Dfinity Foundation. Whenever it votes on proposals, I get rewarded with more ICP, which can be turned into 'Cycles'. Cycles are how you pay for hosting services on the IC. I had been building up a small buffer of rewards for some time, and it was time to turn them into web hosting.

The Dfinity Foundation has some decent documentation on developing for the IC. In fact, they had a literal guide on exactly what I wanted to do! They had an SDK and everything. Super easy to just load it up on Linux or macOS and go to town. This is where the Windows Subsystem for Linux comes in: I installed Ubuntu onto my Windows dev machine, and just like that I could run the Dfinity SDK. After that, the IC SDK really does make it pretty simple. The documentation is friendly, and even better, there's an AI chatbot that can read the documentation for you! I made an Internet Identity for managing the website and claimed some Cycles from the Cycles Faucet, which conveniently offers about $15 worth of Cycles for any new developer to get started. I uploaded my static website into an IC canister, topped that canister up with the Cycles from the faucet, and that was it. Suddenly, step 4 was complete!
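For the curious, the deploy itself boils down to surprisingly little. Here's a rough sketch of what mine amounted to; the canister name `website` and the `dist` source folder are placeholders of mine, so defer to the current IC SDK documentation for the exact layout:

```bash
# dfx.json declares the static site as an "assets" canister, roughly:
#   { "canisters": { "website": { "type": "assets", "source": ["dist"] } } }

# Spin up a local replica and test the site locally first.
dfx start --background
dfx deploy

# Then push it to the IC mainnet, which is what consumes Cycles.
dfx deploy --network ic
```

An asset canister just serves whatever files are in the source folder, so iterating is mostly a matter of editing HTML/CSS and re-running the deploy.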

Step 5 - HTTPS

This one was taken care of during the previous step, when deploying to the IC. The Internet Computer's infrastructure, including its boundary nodes, enforces HTTPS for all communication.

Step 6 - DNS

The next step is a custom domain name. The Internet Computer uses its boundary nodes as an internal router, directing traffic to the appropriate node, so I would need to register the website's domain name both with an internet DNS provider like Cloudflare and with the IC boundary nodes. I knew very little about this process, so I had to look up some documentation and some guides. But eventually I got there. You know what I learned during this process? Patience, lots of patience. Oh, and never to trust Google. First I made the mistake of registering my domain using Google Domains. What's that you say? There is no such thing as Google Domains? Well, there used to be. It became just another headstone in the graveyard of Google projects very shortly after I decided to use it out of convenience, and Google Domains transferred all domains to Squarespace. So first I had to wait for Google to migrate the domain. Then I had to change the DNS settings on Squarespace. Meanwhile I also had to edit my IC canister to register the domain name with the IC boundary nodes. Then I read in the documentation that it needs to validate with the Cloudflare DNS servers before registration, so now I had to wait for those DNS settings to propagate through the whole world before I could even know if it worked. Which it did not. The funny thing is that the Internet Computer documentation for various DNS providers had a nugget of wisdom. Under the instructions for how to point your DNS provider to the IC boundary nodes you can find this:

Rely on an alternative DNS provider (recommended). This approach is explained using Cloudflare as the DNS provider. It works similarly with any other DNS provider that supports CNAME, ALIAS, or ANAME records for the apex of a domain.

Part of the official instructions explains that you should just use Cloudflare. This little nugget of wisdom would have been very useful before I decided to rely on Google Domains, and definitely before they forced me onto Squarespace. So after Squarespace did not work for me, I transferred my domain to Cloudflare, where I should have been this whole time. Then came more waiting as the domain transferred and the DNS settings propagated yet again throughout all of the world's DNS servers. Then came the longest game of cat and mouse I have ever played: it would not work, so I would make a change, deploy the change, then wait up to 48 hours before knowing if the change had worked.
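For anyone attempting the same thing, the general shape of the setup, as best I can reconstruct it from the IC custom-domains documentation, was roughly the records below. Treat this as a sketch rather than gospel (example.com stands in for my real domain, and the exact hostnames may have changed since I did this). The CNAME on the apex is exactly why the docs push you toward a provider with CNAME/ALIAS/ANAME support at the apex:

```
; DNS records at the provider (example.com is a stand-in)
example.com.                  CNAME  example.com.icp1.io.
_canister-id.example.com.     TXT    "<canister-id>"
_acme-challenge.example.com.  CNAME  _acme-challenge.example.com.icp2.io.
```

On top of the DNS records, the canister itself has to claim the domain via a .well-known/ic-domains file listing it, which is the canister edit I mentioned above.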

Step 7 - JavaScript

This step involves using JavaScript to add a visitor counter to the website/resume. I learned some JavaScript in college and used it once or twice at jobs before, but that was 15 years ago at this point. I was going to have to relearn quite a bit. It was the same process as with HTML/CSS: a little free Codecademy here, a touch of LLM assistance there, and I'd have the code ready for everything I planned to do.
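Even before writing a line of it, the shape of the thing is simple enough: ask an API for the count, then drop it into the page. A sketch of that pattern, where the API URL and the element id are hypothetical placeholders for whatever the back end ends up exposing:

```javascript
// Hypothetical endpoint that increments the count and returns the new total.
const COUNTER_API = "https://example.com/api/visit-count";

async function updateVisitorCounter() {
  try {
    const response = await fetch(COUNTER_API, { method: "POST" });
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    const { count } = await response.json();
    document.getElementById("visitor-counter").textContent = count;
  } catch (err) {
    // If the API is unreachable, leave the placeholder text alone.
    console.error("Visitor counter failed:", err);
  }
}

updateVisitorCounter();
```

The important part, per the challenge rules, is that the JavaScript only ever talks to the API; the database stays hidden behind it.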