Work with a REST API using PowerShell

2 min read A well-designed REST API can be consumed and interacted with in many ways, and PowerShell is a particularly useful one because it’s so dynamic. We’ll also assume the API is protected with JWT bearer tokens issued by an OpenID Connect server. Our example API, in this case, is a simple REST API to query and manage users.

Set required headers:

We’ll add the bearer token to the script manually in this case. This step could be automated as well, although the initial setup is considerably more involved; see the following guide for one way of doing it: https://docs.microsoft.com/en-us/information-protection/develop/concept-authentication-acquire-token-ps

$headers = @{}
$headers["Accept"] = "application/json"
$headers["Authorization"] = "Bearer 3a5e90b25ac028ec968def29d0055d418265e9810968eb4a0c531a45fee3b00f"

Cloudflare Worker Conditional Reverse Proxy

< 1 min read A Cloudflare Worker that loads content from a subdomain (or other alternate location) and rewrites references to that location in the response.

addEventListener('fetch', event => {
  const url = new URL(event.request.url);
  if (url.pathname === '/blog' || url.pathname.startsWith('/blog/')) {
    event.respondWith(handleBlog(event, url));
  } else {
    event.respondWith(fetch(event.request));
  }
})

async function handleBlog(event, url) {
  // Load subdomain content / reverse proxy mysite.com/blog to blog.mysite.com subdomain
  const originUrl = url.toString().replace('https://mysite.com/blog', 'https://blog.mysite.com');
  
  // Load content
  let response = await fetch(originUrl);

  // Make sure we only modify text, not images
  let type = response.headers.get("Content-Type") || "";
  if (!type.startsWith("text/")) {
    return response;
  }

  // Read response body
  let text = await response.text();

  // Modify it
  let modified = text.replace(/blog\.mysite\.com/g, "mysite.com/blog");

  // Return modified response
  return new Response(modified, {
    status: response.status,
    statusText: response.statusText,
    headers: response.headers
  });
}

Managing Multi-Cloud Hybrid Environments with DevOps Tools

11 min read I get asked a lot which cloud provider I prefer, even by people who know me well, and the answer I give lately seems to surprise them: a combination of all of them, plus colocated environments. Among the major players in the cloud world, namely GCP, Azure, and AWS, most of the offerings are pretty much on par with each other. People’s preference really comes down to trust in the company’s management of the environment, price, friendliness and familiarity of the interface, and clear visibility into what’s going on. That’s fine as long as you’re only using one provider, but in many enterprises there are really good reasons to use a combination of cloud providers together with on-premises and colocated hardware. These reasons include the availability risk of a global outage at a single provider (there are past examples of this), as well as niche offerings available from only one provider and used by only one department. Let’s take a look at the tools some of the top technology companies use to manage their hybrid multi-cloud environments.

Building a fast and secure blog – Part 4

8 min read

Scanning

The most important thing in security and performance, more than anything else I’d say, is to measure, measure, measure, and once you have all the info, to set up automatic measurement and alerts. We’ve already set up scanning for some basic things like malware, but there’s a lot more to scan for.

SSL / Encryption settings / strength

https://www.ssllabs.com/ssltest/

SSL Server Test from Qualys will test the SSL/TLS configuration of your website, give you plenty of detail about your encryption capabilities, flag known vulnerabilities, and identify misconfigurations. With the settings configured so far, your grade should be A+, but that can change as new threats are discovered, so you should check this regularly.
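
You can also automate the check so you’re alerted when your grade slips. Qualys exposes the test through a public assessment API; a minimal PowerShell sketch, assuming mysite.com is the host to scan (field names follow the public v3 API):

# Start (or fetch a cached) SSL Labs assessment
$hostName = "mysite.com"
$uri = "https://api.ssllabs.com/api/v3/analyze?host=$hostName"
$result = Invoke-RestMethod -Uri $uri

# A fresh assessment runs asynchronously; poll until it completes
while ($result.status -notin @("READY", "ERROR")) {
    Start-Sleep -Seconds 30
    $result = Invoke-RestMethod -Uri $uri
}

# Warn if any tested endpoint slips below A+
$result.endpoints | Where-Object { $_.grade -ne "A+" } | ForEach-Object {
    Write-Warning "$($_.ipAddress) graded $($_.grade)"
}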

Building a fast and secure blog – Part 3

9 min read

Setting up Cloudflare

Sign up for a free account at https://www.cloudflare.com/.

Upgrading to Pro has some definite benefits

Add your site

As soon as you log in, you have the option of adding your first site

Verify your DNS records

At the next step, Cloudflare will try to detect and import all your existing DNS records. You’ll be changing your nameservers to Cloudflare’s next, so make sure all your DNS records are present. If the situation really requires it, there is an option to skip the nameserver change and proceed with CNAME records instead, but you’ll have to reach out to Cloudflare support to discuss it.
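
Before you flip the nameservers, it’s worth spot-checking that the imported records match what your zone serves today. A quick sketch using PowerShell’s built-in Resolve-DnsName cmdlet; the names and record types below are examples, so check everything your zone actually uses:

# Compare a few critical records against what Cloudflare imported
Resolve-DnsName mysite.com -Type A
Resolve-DnsName www.mysite.com -Type CNAME
Resolve-DnsName mysite.com -Type MX
Resolve-DnsName mysite.com -Type TXT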

Building a fast and secure blog – Part 1

4 min read I find WordPress sufficient for my blogging needs, so it’s my go-to for a really simple site or blog. If custom logic is needed, though, WordPress is a no-go, and I go fully custom based on what’s needed. “Right tech for the job.”

In this series I’ll show how to create a simple, fast and security-conscious blog.

Part 1: Hosting / installation

Part 2: Plugins, upgrading PHP, HTTP security headers

Part 3: Caching, WAF and Optimizations

Part 4: Monitoring and performance testing

Hosting / Installation

Azure, AWS, and GCP have great free offerings for getting started, with free tiers that are probably sufficient for small blogs, and WordPress.com could be a good option as well. I prefer DigitalOcean in this case because I have full control over the VM, it’s really cheap ($6/month), it’s a one-click deploy droplet, and it’s really fast. DigitalOcean also monitors security bulletins and sends me relevant information on vulnerabilities so I can patch anything that’s needed, and they handle backups seamlessly.

Download and extract gzip tar with PowerShell

3 min read We found ourselves needing to download an updated version of a public dataset on a regular basis, so PowerShell plus the Windows Task Scheduler came to mind, since the application runs in a Windows environment. It turns out, though, that PowerShell doesn’t make this quite trivial. In PowerShell v5+ we have the Expand-Archive command (Expand-Archive c:\a.zip -DestinationPath c:\a), but it doesn’t support gzip or tar. gzip is a compression format based on the DEFLATE algorithm, which is a combination of LZ77 and Huffman coding. There’s a…
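
As a sketch of one way to script this end to end: on Windows 10 1803 and later, a bsdtar build ships with the OS as tar.exe, which handles .tar.gz directly. The URL and paths below are placeholders:

# Download the dataset (URL and paths are placeholders)
$url  = "https://example.com/dataset.tar.gz"
$file = "$env:TEMP\dataset.tar.gz"
$dest = "C:\data\dataset"

Invoke-WebRequest -Uri $url -OutFile $file

# Extract with the tar.exe bundled in Windows 10 1803+
New-Item -ItemType Directory -Path $dest -Force | Out-Null
tar -xzf $file -C $dest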

MSSQL Shrink / Truncate transaction log

2 min read SQL transaction logs allow you to restore a database to a specific point in time, which is a great option for a production database, but these logs must be backed up frequently enough to keep them from filling up. Under the Full or Bulk-Logged recovery models, you need to back up the transaction log itself, not just the database. This article describes the right way to set this up: https://docs.microsoft.com/en-us/sql/relational-databases/backup-restore/back-up-a-transaction-log-sql-server?view=sql-server-2017

If you’ve already taken a full backup of your database and find yourself in a crunch, out of space, the following is a quick way of clearing the transaction log to recover space:

Log into Microsoft SQL Server Management Studio…
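
The post continues in the SSMS UI, but for reference, the same emergency shrink can be scripted. A hedged sketch using Invoke-Sqlcmd from the SqlServer module; the database name, logical log file name, and server instance are placeholders. It temporarily switches to the SIMPLE recovery model, shrinks the log, and switches back:

# Placeholders: [MyDb], MyDb_log, and the server instance
# Requires the SqlServer module (Install-Module SqlServer)
$query = @"
ALTER DATABASE [MyDb] SET RECOVERY SIMPLE;
DBCC SHRINKFILE (N'MyDb_log', 1);
ALTER DATABASE [MyDb] SET RECOVERY FULL;
"@
Invoke-Sqlcmd -ServerInstance "localhost" -Query $query

# Switching recovery models breaks the log backup chain,
# so take a new full backup immediately afterwards.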

Installing and configuring SonarQube with Azure DevOps/TFS

8 min read Our team follows a process adapted from Microsoft’s Release Flow [see: https://blogs.msdn.microsoft.com/devops/2018/04/19/release-flow-how-we-do-branching-on-the-vsts-team/], in which we create a branch off develop (our long-running, mostly-stable branch of the product), do our work, commit it (with the PBI/Bug number in a comment), push the branch, then go into TFS and create a pull request. TFS will suggest a shortcut link to create a PR from the branch you just pushed into your default branch (in our case develop), or you can click the New pull request button and choose your source and target branches.
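
For reference, the branch-commit-push loop looks roughly like this from the command line; the branch name and PBI number are made-up examples:

# Branch off develop (names and numbers are examples)
git checkout develop
git pull
git checkout -b feature/1234-sonarqube-quality-gate

# ...do the work, then commit with the PBI number in the comment
git add .
git commit -m "PBI 1234: add SonarQube analysis to the build"
git push -u origin feature/1234-sonarqube-quality-gate
# TFS then offers a shortcut link to open a PR into develop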