In the last article I discovered the Cloud Resume Challenge. The first part of that challenge is choosing a cloud provider to host your resume on and earning certifications in those technologies. I chose not to use any of the providers listed in the official challenge. I submit that it does not matter; it's about learning and pushing yourself. So the next steps involve actually building the website.
Cloud Resume Challenge - Part 2: Building the Website

Step 1 - Certification

I skipped this one for now. I didn't pick an official cloud provider, so there is no official certification to earn. I will probably get to this later. Updates will appear here as I go through the process.
Step 2 and 3 - HTML/CSS

I wanted more than just a resume. I wanted a website. I looked through many templates, but in the end there was only one conclusion: build my own from scratch. Not only was that the intent of the challenge, it was also the best way for me to learn how to use it and change it. To build is to know. It would make me come up with my own way to display my accomplishments, and it would be an accomplishment in itself.

I came up with a plan: a homepage that could host both text posts and image galleries. This could showcase my resume, coding projects, writing projects, and physical projects all in one. To do this it would need to be modular. I needed formats and templates for pretty much any kind of content that I wanted to host.

I did already have some experience in this field. First in a middle school classroom. Then in my first job post-college, where I used HTML/CSS and JavaScript to create webforms to gather testing data from employees testing X-Ray receptors fresh off the factory line. Later I used HTML/CSS and PHP to create advertisements and emails for a company that wanted to blast special bulk discount deals to clients. That was years ago, however, and I needed to knock some of the rust off.

I began with the W3Schools tutorials, then some Codecademy courses. I started from the beginning to re-establish my fundamentals, and I put together the basics of my website using those alone. Then it came time for fine-tuning and troubleshooting. I had become proficient enough to execute the image of my plan, but making it work correctly was just taking so long. About this time Large Language Models were having a moment, and I discovered the coding power of AI. It was like having a college professor at my side at all times. I could ask 'What are all the possible parameters of this function?' and I would receive a list of all possible inputs along with some examples of how to use them.
I could ask 'There is a certain element on my webpage that is not behaving as expected in these ways. What could cause this?' and I would get three different possibilities that I could then go and investigate. These AI tools did not create the website for me. That would defeat the purpose of this challenge anyway. What they did was shorten the amount of time I needed to pull it off. AI tools are just a rubber ducky that can talk back. A rubber ducky with the power of every question ever asked on Stack Exchange behind it is a completely different beast. With the support of my LLM rubber ducky I came up with a modular and responsive HTML/CSS static website. Go ahead and change your color preference on your device; you'll see that the whole website changes! I began with the idea that this website would use PHP to be more dynamic, and the modularity mostly stemmed from that. But for now a static website is the goal of the challenge, and a much cheaper option anyway.
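The automatic recoloring mentioned above is the kind of thing the CSS `prefers-color-scheme` media query handles. Here is a minimal sketch of the technique; the property names and colors are my own illustration, not the site's actual stylesheet:

```css
/* Define colors as custom properties for the light theme */
:root {
  --bg-color: #fdfdfd;
  --text-color: #1a1a1a;
}

/* Override them when the visitor's device prefers dark mode */
@media (prefers-color-scheme: dark) {
  :root {
    --bg-color: #1a1a1a;
    --text-color: #e6e6e6;
  }
}

body {
  background-color: var(--bg-color);
  color: var(--text-color);
}
```

Because every element reads from the same custom properties, the whole site recolors itself the moment the device preference changes, with no JavaScript involved.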
Step 4 - Static Website

So here I was with the makings of my very own website, of my own design and hand-made from scratch, hosted on my very own NAS and accessible from anywhere inside my house. I needed to get it hosted online. Now it was time for me to learn more about the Internet Computer. I had been following the IC ever since its launch during the crypto fervor of 2021. It would be easy to mistake the Internet Computer and its governance token ICP for another crypto fad like NFT art. But when I learned about the Internet Computer, I was smitten.

Before you can learn about the IC, though, let's go through a bit of crypto history. First, cryptocurrency was just a currency. Bitcoin allowed peer-to-peer value transfer without intermediaries or banks. Anyone could send BTC anywhere in the world, with the blockchain verifying and publicly recording the transaction. Then came Ethereum, introducing smart contracts: self-executing agreements and programs that operated securely and autonomously using blockchain technology. With crypto, new possibilities emerged: renting spare HDD space anonymously with STORJ tokens, accessing global GPU power for video rendering with RNDR tokens, installing a backyard weather station to earn WXM tokens that could be sold for access to real-time weather data, or companies and governments getting access to high-quality aerial imagery from drone operators earning Spexi tokens. All of this was cryptographically secured, publicly verifiable, and fully autonomous. Cryptocurrency, often dismissed as "magic internet money," has become an autonomous marketplace for services, challenging the monopoly of FAANG corporations, which demand personal data and monthly fees for similar offerings.

The Internet Computer, developed by the DFINITY Foundation, takes this further. It's a decentralized blockchain platform designed to securely and scalably host a new generation of decentralized applications (dApps).
By revolutionizing how web applications are built and used, it offers a vision of the future where centralized control is no longer necessary. At least, that's how DFINITY describes it, and I think they're right.

I like data centers. They are large, high-tech constructs packed to the brim with cutting-edge technology. They house so much raw compute power that they can accomplish just about anything. They can store all of the data that the US Government has dredged out of its citizens. They can play a game for you and transmit it to your house. They can render movies and stream them directly to your home TV. They can calculate a procedurally generated galaxy and let players fly pretend spaceships around in it. They can power an AI that has read every word ever written by human hands and remembers all of it. A single data center can do all of these things at once. Maybe someday they can even simulate a human brain. I love that. I love them. I want one. But you can't just buy one; it would cost you close to a trillion USD to build one.

That's where the Internet Computer comes in. What they are actually doing is providing the same services as a data center, but they don't even have one. They let just about anyone provide a small amount of data center power and then combine it all into one monstrous virtual data center: a competitor to AWS and Azure, one that is distributed and autonomous. People from all over the world can provide the server nodes that this data center runs on.

That was a really long way to say that the Internet Computer can host a website on the public internet in much the same way as hosting 'serverless' content on Azure or AWS. But Azure and AWS demand monthly subscriptions, and I despise subscriptions. I planned to host my website on the IC forever with a single up-front payment. This was made possible because of the Network Nervous System that controls the IC.
The NNS controls the IC by accepting and implementing proposals for changes to the IC: proposals such as software updates, adding or removing nodes, disbursing payments to node providers, et cetera. All stakeholders then vote on whether to accept or reject each proposal, and voters are rewarded with ICP tokens. A stakeholder is anyone who has 'staked' ICP into a voting neuron. So all I had to do was pay an up-front fee for a small cache of ICP. I then staked all of the ICP into the NNS and told that staked 'neuron' to follow the votes of the DFINITY Foundation. Whenever it votes on proposals I get rewarded with more ICP tokens, which can be turned into 'Cycles'. Cycles are how you pay for hosting services on the IC. I have been building up a small buffer of rewards for some time now, and it's time to turn them into free web hosting!

The DFINITY Foundation has some decent documentation on developing for the IC. In fact, they had a literal guide on exactly what I wanted to do! They had an SDK and everything, super easy to load up on Linux or macOS and go to town. This is where the Windows Subsystem for Linux comes in. I installed Ubuntu onto my Windows dev machine, and just like that I could run the DFINITY SDK! After that, the IC SDK really does make it pretty simple. They have friendly documentation, and even better, they have an AI chatbot that can read the documentation for you! I made an Internet Identity for managing the website, then added some Cycles to it from the Cycles Faucet, which conveniently offers about $15 worth of Cycles to any new developer getting started. I uploaded my static website into an IC canister and added the Cycles from the faucet to that same canister. Suddenly, step 4 was complete!
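For anyone curious what 'uploading a static website into a canister' looks like in practice: the IC SDK (dfx) treats a folder of static files as an asset canister, described in a dfx.json file at the project root. Running dfx deploy --network ic then pushes the files onto the IC. This is a minimal sketch; the canister name website and the dist source folder are placeholders, not my actual project layout:

```json
{
  "canisters": {
    "website": {
      "type": "assets",
      "source": ["dist"]
    }
  }
}
```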
Step 5 - HTTPS

This step was taken care of in the previous step, when deploying to the IC. This is because the Internet Computer's infrastructure, including its boundary nodes (discussed in the next step), automatically enforces HTTPS for all web traffic.
Step 6 - DNS

The next step is a custom domain name. The Internet Computer uses its boundary nodes as an internal router, directing traffic to the appropriate node. So I would need to register the website's domain name both with an internet DNS provider like Cloudflare and with the IC boundary nodes. I knew very little about this process, so I had to look up some documentation and guides. But eventually I got there.

You know what I learned during this process? Patience, lots of patience. Oh, and never to trust Google. First I made the mistake of registering my domain using Google Domains. What's that you say? There is no such thing as Google Domains? Well, it used to exist. It became just another headstone in the graveyard of Google projects very shortly after I decided to use it out of convenience. Google Domains transferred all domains to Squarespace. So first I had to wait until Google migrated the domain. Then I had to change the DNS settings on Squarespace. Meanwhile, I also had to edit my IC canister to register the domain name with the IC boundary nodes. Then I read in the documentation that it needs to validate with the Cloudflare DNS servers before registration. So now I had to wait for those DNS settings to propagate through the whole world before I could even know whether it worked. Which it did not. The funny thing is that the Internet Computer documentation for various DNS providers had a nugget of wisdom. Under the instructions for how to point your DNS provider to the IC boundary nodes you can find this:
Rely on an alternative DNS provider (recommended) It is explained in this approach using Cloudflare as DNS provider. It works similar with any other DNS provider that supports CNAME, ALIAS, or ANAME records for the apex of a domain.
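For reference, registering a custom domain with the IC boundary nodes comes down to a handful of DNS records plus a small file served by the canister itself. This is a rough sketch based on the documentation I followed, with example.com standing in for the real domain:

```
# DNS records at your provider (Cloudflare, Squarespace, etc.)
example.com.                  CNAME  example.com.icp1.io.
_acme-challenge.example.com.  CNAME  _acme-challenge.example.com.icp2.io.
_canister-id.example.com.     TXT    "<your-canister-id>"

# Plus a file in the canister at /.well-known/ic-domains
# containing the domain name(s), one per line:
example.com
```

Once the records propagate, the boundary nodes validate the domain and provision the HTTPS certificate automatically.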
Step 7 - JavaScript

This step involves using JavaScript to add a visitor counter to the website/resume. I learned some in college and used it once or twice at jobs before, but that was 15 years ago by this point. To do this I was going to have to relearn quite a bit. It was the same process as with HTML/CSS: a little free Codecademy here, a touch of LLM assistance there, and I'd have my code ready for all I planned to do. This is a fairly straightforward thing to do, and a quick way to learn how to implement scripts in a static website. There are numerous tutorials on how to do it. So, in short order I had a bar at the top of every page showing the number of times a user has loaded the page. The number is saved locally in a cookie, with a localStorage fallback.
```javascript
// Get a cookie value by its name
function getCookie(name) {
  const cookies = document.cookie.split('; ').reduce((acc, cookie) => {
    const [key, value] = cookie.split('=');
    acc[key] = value;
    return acc;
  }, {});
  return cookies[name] || null;
}

// Create or update a cookie (default lifetime: ten years)
function setCookie(name, value, days = 3650) {
  const expires = new Date();
  expires.setDate(expires.getDate() + days); // Set expiration date
  document.cookie = `${name}=${value}; expires=${expires.toUTCString()}; path=/; SameSite=Strict`;
}

// Increment the hit counter for the current page
function updateHitCounter() {
  // Use the full page path as part of the cookie name
  const pageName = window.location.pathname
    .replace(/^\//, '')   // Remove leading slash
    .replace(/\//g, '_')  // Replace remaining slashes with underscores
    || 'homepage';
  const cookieName = `LBL_pageCount_${pageName}`;

  // Retrieve the current counter value for this page, or default to 0
  let count = parseInt(getCookie(cookieName), 10) || 0;
  try {
    // localStorage fallback for browsers that block cookies
    if (!count) count = parseInt(localStorage.getItem(cookieName), 10) || 0;
  } catch (e) {
    // Storage unavailable (e.g. private browsing); start from 0
  }
  count++; // Increment the counter

  // Persist the new count in both places
  setCookie(cookieName, count);
  try {
    localStorage.setItem(cookieName, count);
  } catch (e) {
    // Ignore if localStorage is unavailable
  }

  // Update the visible counter on the page if the element exists
  const hitCounterElement = document.querySelector('.hitCounter');
  if (hitCounterElement) {
    hitCounterElement.innerText = count;
  }
}

// Initialize the hit counter on page load
window.addEventListener('load', updateHitCounter);
```
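For completeness, the script only needs one hook in the markup: any element with the hitCounter class. A minimal sketch of the hookup; the surrounding structure and the hitCounter.js filename are illustrative, not my actual page:

```html
<!-- Somewhere in the page header; the script fills in the number on load -->
<div class="topBar">
  Page views: <span class="hitCounter">0</span>
</div>
<script src="hitCounter.js"></script>
```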
That's pretty much it for the static part of my Static Website. The next steps involve databases and other, more dynamic features of the cloud. This is pretty much the end of everything that I was really confident in; ahead are a lot of technologies that I have never used before. Continue the journey in Part 3 here: Cloud Resume Journey - Part 3