Monday, March 23, 2020

CapRover deployment controlled database migration

I started using CapRover recently for running first-party and third-party web services. CapRover lets me set up apps with HTTPS enabled, with certificates automatically issued by Let's Encrypt, and I can scale instances up or down at any time, on a single machine or even across multiple machines.

As of version 1.6.1, I have not found any built-in functionality that supports more complex deployments, e.g. when you have multiple app instances and you need to migrate a database before deploying a new version of the app.

I thought about writing a pre-deploy script, but I spent a few days exploring other solutions first, including other PaaS options like Dokku; in the end I preferred the convenience of CapRover. Then I came across a comment in one of their GitHub issues which gave me enough hints to get started on my own solution.

What my script does:

  1. Sets the instance count of the app to 0 to prevent the old version of the app from accessing the database during or after the migration.
  2. Runs the database migration by appending --migrate-database to the entrypoint specified in my Dockerfile; the app understands this flag. (For reference, the app is an ASP.NET Core Web API app that uses Entity Framework Core.) The temporary Docker container is automatically removed afterwards.
  3. Passes in all of the environment variables specified for the app and uses the same network, so that the database server can be reached at srv-captain--*.

Next, I'll have to think about how to back up the database before migration. Still, this works well enough for rapid deployment to a test/staging environment.

var preDeployFunction = async function (captainAppObj, dockerUpdateObject) {
    const DockerApi = require("./built/docker/DockerApi");
    const api = new DockerApi.default();

    // Scale the service to the given replica count.
    const setServiceInstances = async (service, count) => {
        const inspection = await service.inspect();
        const updateObject = { ...inspection.Spec, Mode: { Replicated: { Replicas: count } }, version: inspection.Version.Index };
        await service.update(updateObject);
    };

    // Run the app image once with extra args, streaming output to stdout.
    const run = async (args) => {
        const imageName = dockerUpdateObject.TaskTemplate.ContainerSpec.Image;
        // envVars is CapRover's list of { key, value } pairs for the app.
        const env = => kv.key + "=" + kv.value);
        const config = { Env: env, HostConfig: { AutoRemove: true, NetworkMode: captainAppObj.networks[0] } };

        // resolves to [output, container]; output has StatusCode.
        const [output] = await, args, process.stdout, config);

        if (output.StatusCode !== 0) {
            throw new Error(`Failed to run image ${imageName} with args ${args} (status code ${output.StatusCode}).`);
        }
    };

    const service = api.dockerode.getService(dockerUpdateObject.Name);
    await setServiceInstances(service, 0);
    await run(["--migrate-database"]);
    dockerUpdateObject.version = (await service.inspect()).Version.Index;

    return dockerUpdateObject;
};

Friday, August 7, 2015

My opinions about Windows 10


The good:

  • Slightly enhanced command prompt.
  • Windows 7 backup solution is back, but perhaps something new and better would be good.
  • Multiple real desktops and switching.
  • Snapping windows to corners.
  • Snapping windows to the left or right side makes the window fill the remaining space; this is nice but now it's harder to fill only half of the screen. Edit: After the last cumulative update, the control key actually does modify the behavior. I was sure it did not – at least before the update.
  • Easier to adjust audio volume precisely; not because the resolution is higher, but because the control is larger.


The bad:

  • "MBR error 1" on first and consecutive startups after the upgrade; solved using these commands:
    bootrec /FixMbr
    bootrec /FixBoot
  • Make sure to uninstall Microsoft Security Essentials unless you want to waste time upgrading again.
  • New Photo viewer:
    • Zooming and panning is blurry and blocky, especially while panning.
    • Browsing pictures shows rectangles with a solid color instead of the pictures themselves.
  • Cannot remove pictures (from the UI list) that were previously set on the lock screen.
  • Cannot remove custom high contrast themes. Speaking of that, the amount of options is poor.
  • The font in certain text boxes (particularly the search box in Explorer) is harder to read.
  • Fine-tuning color of taskbar without changing the color of other things as well seems to be impossible.
  • I have turned off animations globally in Windows, but all modern UIs still animate stuff, including the start menu.
  • Start menu:
    • Documents folder on the start menu is now by default an extra click away.
    • No way to easily remove multiple (live and app) tiles I don't want on the start menu.
    • Noticeable delay before the start menu shows up.
    • Context menu for search results (files, desktop apps) in the start menu is crippled (no shell menu entries).
    • Cannot drag and drop any search results (files, desktop apps) from the start menu to any control/toolbar that accepts dropped files.
  • No distance between border/edge and text in command prompt.
  • [Right click taskbar > Properties] opens "Taskbar and Start Menu Properties", but there is nothing for the start menu there.
  • Command prompt still does not display unicode characters.
  • Harder to see what part of a window is draggable (e.g. caption) because everything is a single color.
  • Disabling Windows Defender is harder (use gpedit.msc).
  • Mixed quality icons (amount of color, proportions, design). 3D or 2D, few colors or many, few details or many... Particularly not a fan of the new shield icon.
  • No shortcut keys for switching to specific desktops (e.g. desktop #1-4)?
  • Default privacy settings are intrusive.
  • I somehow managed to record a "game", but actually it was just my web browser. When I try to view my recordings, the app just closes after a few seconds (maybe it crashes).
  • After hibernation or locking the screen, the logon/lock screen does not become ready for input when pressing the Shift key.
  • The File Explorer window sometimes flashes as if it closes and re-opens, and the current selection clears. Workaround: Do not open new windows in separate processes.
  • After closing the File Explorer, it sometimes re-opens immediately afterwards. Workaround: Do not open new windows in separate processes.
  • While doing something with the start menu and/or the File Explorer, I got an error message from RuntimeBroker.exe: Not implemented
  • The UI language in the OS is English, and my region settings are set to Norwegian/Norway; yet, the modern apps are actually in a completely different language (Japanese). Maybe because I once upon a time set the region of my Xbox 360 to Japan.

Wednesday, March 5, 2014

Error when starting VS 2010: The 'Microsoft.TeamFoundation.Client.ServicesHostPackage, Microsoft.VisualStudio.TeamFoundation.TeamExplorer, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' package did not load correctly

I tried to open Visual Studio 2010, and was met with the following error message:

Microsoft Visual Studio
The 'Microsoft.TeamFoundation.Client.ServicesHostPackage, Microsoft.VisualStudio.TeamFoundation.TeamExplorer, Version=, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' package did not load correctly.

The problem may have been caused by a configuration change or by the installation of another extension. You can get more information by running the application together with the /log parameter on the command line, and then examining the file 'C:\Users\sl\AppData\Roaming\Microsoft\VisualStudio\10.0\ActivityLog.xml'.

Continue to show this error message?
Yes   No   

The computer used to run the following software:

  • Team Foundation Server 2010 (probably with SP1)
  • Visual Studio 2010 with SP1

Since then, I've done the following:

  • Uninstalled TFS 2010 and everything related, and upgraded to TFS 2013.
  • Kept VS 2010, and installed VS 2013 side-by-side.

My problem was solved after I did the following:

  • Installed Team Explorer from the TFS 2010 disc image.
  • Reapplied VS 2010 SP1.

Monday, February 3, 2014

NuGet install fails: "Unable to read package from path '[snip].nupkg'"

In my case, this issue came almost out of nowhere. The package installation/restoration succeeded using a regular Windows account, but failed under the SYSTEM account. You've probably already verified permissions and whatnot, so I'll get straight to the solution to my specific problem: the NuGet cache was corrupted.

To verify that this is the case for you, delete the "packages" folder in your solution folder, then attempt to restore/install packages again, adding the "-NoCache" argument to your command line to prevent NuGet from using the local cache. See the NuGet command line reference.

Once you verify that the cache is the problem, locate the cache folder, then simply delete the folder.
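For example, the two steps might look like this from a command prompt (a sketch; the solution filename is a placeholder, and the cache path is the regular-user location listed later in this post):

```shell
:: 1) Verify: restore while bypassing the local NuGet cache.
nuget.exe restore MySolution.sln -NoCache

:: 2) Fix: delete the corrupted cache for the account NuGet runs under.
rmdir /S /Q "%LOCALAPPDATA%\NuGet\Cache"
```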

Restore/Install packages once more to verify success!

NuGet cache locations

Regular user accounts

%LOCALAPPDATA%\NuGet\Cache (C:\Users\<user>\AppData\Local\NuGet\Cache)

SYSTEM account


Friday, October 18, 2013

Error: The target "_WPPCopyWebApplication" does not exist in the project

This applies to both Visual Studio 2010 and Team Foundation Server (TFS) 2010.

In my case, I had two projects of the following types that needed to be published to the file system after build:

  • WCF Web Service Library


Build succeeds in the development environment, but fails on the CI/build server, giving the following error message:

The target "_WPPCopyWebApplication" does not exist in the project


Certain files do not exist unless you've installed Visual Studio on your build server.

The "good practice"-way to solve it

You can install Visual Studio on your build server to fix this easily. If maintaining copied files worries you, I recommend simply installing Visual Studio and forgetting about it.

Now for the other method

On your development computer with Visual Studio installed, open the folder %ProgramFiles(x86)%\MSBuild\Microsoft\VisualStudio\v10.0.

For web development, two folders are required:

  • Web
  • WebApplications

Copy these folders to the same place on the build server.
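The copy can be done however you like; for instance, from a command prompt on the build server (a sketch; the UNC share is a made-up placeholder for wherever you staged the two folders from the development machine):

```shell
:: Copy the two MSBuild target folders into place on the build server.
xcopy /E /I "\\devbox\share\Web" "%ProgramFiles(x86)%\MSBuild\Microsoft\VisualStudio\v10.0\Web"
xcopy /E /I "\\devbox\share\WebApplications" "%ProgramFiles(x86)%\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications"
```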

Your awesome programs should be building and publishing properly now, but read on if the problem persists.

Files copied, but it still doesn't work?

In your project file (*.csproj), make sure to import the right .target file before trying to use "_WPPCopyWebApplication".

Here's an example; this code automatically publishes a web application/service to the file system after being built:

  <Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" />
  <Target Name="AfterBuild">
    <MSBuild Condition="'$(hasBuilt)' == ''" Projects="$(ProjectPath)" Properties="hasBuilt=true;Platform=$(Platform);Configuration=$(Configuration);WebProjectOutputDir=published" Targets="ResolveReferences;_WPPCopyWebApplication" />
  </Target>

In case you wonder where to put this, somewhere near the bottom of the project file is fine.

Check list
  1. Import Microsoft.WebApplication.targets, not Microsoft.Web.Publishing.targets. The latter will be imported automatically, so specify only the first one.
  2. Use $(MSBuildExtensionsPath32), not $(MSBuildExtensionsPath). The files don't exist under $(MSBuildExtensionsPath).

Decent reasons to only copy the files

  • Visual Studio 2013 was released just about a day ago, and I don't think we'll see (m)any updates for VS 2010.
  • The difference in size is several gigabytes (installing VS 2010) versus less than one megabyte (copying the files).


There are at least two ways to solve the problem; the right one depends on your needs.

If you install Visual Studio, the pro is that you never have to think about the files again, not even when updates are released. The obvious con of copying only the necessities is potential maintenance.

With all this in mind, I choose to copy just the files. They take less than 1 MB compared to several GB, and I think that's fair considering the age of Visual Studio 2010. If the problem persists with VS 2012 and 2013, I'm not certain I would make the same decision at the time of writing.

Wednesday, July 17, 2013

CyaSSL and 4096-bit certificates

As of version 2.6.0, CyaSSL now uses the fastmath library by default (versus the big integer library) when building with the ./configure system.

One of the less portable aspects of fastmath is the need for fixed-size buffers to reduce dynamic memory use. By default, these buffers allow a 2048-bit x 2048-bit multiply into a 4096-bit buffer. Since most sites use 2048-bit RSA keys, this is fine. But for those sites/users that have a 4096-bit RSA key, the fastmath buffer size needs to be increased to 8192. Since your certs use 4096-bit RSA keys, you'll need to increase the size by modifying the define in <cyassl_root>/cyassl/ctaocrypt/tfm.h, setting it to 8192.
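In current wolfSSL/CyaSSL sources the define in question is FP_MAX_BITS; verify the name against your copy of tfm.h before changing it. The change itself is just a one-line config fragment:

```c
/* <cyassl_root>/cyassl/ctaocrypt/tfm.h */
/* Allow fastmath multiplies large enough for 4096-bit RSA keys
   (the default of 4096 only covers 2048-bit keys). */
#define FP_MAX_BITS 8192
```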

Sunday, February 17, 2013

Load SSL certificate from memory using cURL and CyaSSL

The cacertinmem sample in cURL shows you how to do this with OpenSSL, but it's slightly different, and actually simpler, if you use CyaSSL.

From the CyaSSL manual:

To use the extension define the constant NO_FILESYSTEM and the following functions will be made available:
int CyaSSL_CTX_load_verify_buffer(...)
int CyaSSL_CTX_use_certificate_buffer(...)
int CyaSSL_CTX_use_PrivateKey_buffer(...)
int CyaSSL_CTX_use_certificate_chain_buffer(...)
Use these functions exactly like their counterparts that are named file instead of buffer.
And instead of providing a filename, provide a memory buffer.

Use the NO_FILESYSTEM preprocessor definition when building CyaSSL, cURL, as well as in your own project.

In my case, I wanted to load a bundle of CA certificates, and would therefore use CyaSSL_CTX_load_verify_buffer() (see the CyaSSL API reference).

In addition, you must be able to use the SSL context callback function, which isn't supported in cURL (version 7.29.0) when using SSL libraries other than OpenSSL. To solve this problem, read my article about the SSL context function with cURL and CyaSSL.

Bundle of CA root certificates

Useful when you don't know which SSL providers were used. I got it from

I compressed the CA certificates with GZip before embedding them. Poco (a C++ framework) took care of decompression.

Reduction: 115 KB / ~45% (245 KB down to 130 KB).

Probably sensible enough if GZip (or whatever you want to use) is already used elsewhere in your application (thinking of the increased code size otherwise).

#include "CaCert.h"

#include <curl/curl.h>
#include <cyassl/ssl.h>

#include <string>

// ...

CURLcode sslContextCallback(void* ctx) {
    // Load CA certificates from memory
    CaCert cert;
    const std::string& certData = cert.GetData();
    CyaSSL_CTX_load_verify_buffer(
        reinterpret_cast<CYASSL_CTX*>(ctx),
        reinterpret_cast<const unsigned char*>(certData.c_str()),
        certData.size(),
        SSL_FILETYPE_PEM);
    return CURLE_OK;
}

// ...
curl_easy_setopt(curl, CURLOPT_SSL_CTX_FUNCTION, *sslContextCallback);
#ifndef __CaCert__
#define __CaCert__

#include <string>

class CaCert {
public:
    const std::string& GetData();

private:
    static const unsigned char m_cacert_pem_gz[];
    std::string m_cacert_pem;
};

#endif // guard
#include "CaCert.h"
#include "cacert.pem.gz.h"
#include <Poco/InflatingStream.h>
#include <sstream>

const std::string& CaCert::GetData() {
    if (m_cacert_pem.empty()) {
        // Decompress the embedded GZip data on first use.
        std::string gzstr(reinterpret_cast<const char*>(&m_cacert_pem_gz[0]), sizeof(m_cacert_pem_gz));
        std::istringstream istr(gzstr);
        Poco::InflatingInputStream inflater(istr, Poco::InflatingStreamBuf::STREAM_GZIP);
        std::stringstream ostr;
        ostr << inflater.rdbuf();
        m_cacert_pem = ostr.str();
    }

    return m_cacert_pem;
}
This file was generated using Hex Workshop and modified slightly.

// Generated by BreakPoint Software's Hex Workshop v6.6.1.5158
//  Source File: C:\code\curltest\cacert.pem.gz
//         Time: 17.02.2013 04:55
// Orig. Offset: 0 / 0x00000000
//       Length: 133503 / 0x0002097F (bytes)
const unsigned char CaCert::m_cacert_pem_gz[133503] = { /*...*/ };

Reduce size and memory footprint

We don't always need all of the CA root certificates. We can remove all of the certificates we don't need, and any comments. In my case, I needed only StartCom certificates.

I skipped compression since the file was only 7.37 KB.

Reduction: 236.63 KB / ~96.9%.

#include "StartComCaCert.h"

#include <curl/curl.h>
#include <cyassl/ssl.h>

#include <string>

// ...

CURLcode sslContextCallback(void* ctx) {
    // Load CA certificates from memory
    StartComCaCert cert;
    const unsigned char* certData = cert.GetData();
    CyaSSL_CTX_load_verify_buffer(
        reinterpret_cast<CYASSL_CTX*>(ctx),
        certData,
        cert.GetSize(),
        SSL_FILETYPE_PEM);
    return CURLE_OK;
}

// ...
curl_easy_setopt(curl, CURLOPT_SSL_CTX_FUNCTION, *sslContextCallback);
#ifndef __StartComCaCert__
#define __StartComCaCert__

class StartComCaCert {
public:
    const unsigned char* GetData() const;
    unsigned int GetSize() const;

private:
    static const unsigned char m_cacert_StartCom_pem[];
};

#endif // guard
#include "StartComCaCert.h"
#include "cacert_StartCom.pem.h"

const unsigned char* StartComCaCert::GetData() const {
    return &m_cacert_StartCom_pem[0];
}

unsigned int StartComCaCert::GetSize() const {
    return sizeof(m_cacert_StartCom_pem);
}
This file was also generated.

// Generated by BreakPoint Software's Hex Workshop v6.6.1.5158
//  Source File: C:\code\curltest\curltest\cacert_StartCom.pem
//         Time: 18.02.2013 01:23
// Orig. Offset: 0 / 0x00000000
//       Length: 7547 / 0x00001D7B (bytes)
const unsigned char StartComCaCert::m_cacert_StartCom_pem[7547] = { /*...*/ };

It would be ridiculous to continue, but let's do it!

Looking at the certificates as they are stored in PEM format, our precious bytes are obviously wasted on making the file look pretty to humans.

While keeping the base-64-encoded data (which also wastes space), let's go ahead and write a small Python script to minify the file.

It removes everything at the beginning of the file (comments, newlines), repeated equal signs, and the newlines mixed into the base-64-encoded data.

import re

def stripNewlines(match):
    # Keep a single "=", the BEGIN/END markers, and the base-64 data
    # with its inner newlines stripped.
    return ( + +
  "\n", "") +

inFile = open("cacert_StartCom.pem", "r")
caCertData =

# Strip comments and everything else in the beginning of the file
result = re.sub(r"(?s)^.*?\n(\w)", r"\1", caCertData)
# Strip the newlines we can strip
result = re.sub(r"(?s)(=)+\s*(\s-----BEGIN CERTIFICATE-----\s)(.*?)(\s-----END CERTIFICATE-----\s)\s*", stripNewlines, result)
# Strip remaining newlines at the end
result = result.rstrip()

outFile = open("cacert_StartCom.min.pem", "w")
outFile.write(result)

Reduction: 175 bytes / ~2.32%.

Just for the sake of it, minify the original cacert.pem

Reduction: 7.53 KB / 3%.
Comparing old and new GZipped file: 5.98 KB / 4.59% reduction.
Total reduction: 121 KB / 49.32%.
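The percentages above can be sanity-checked with a few lines of Python, using the byte sizes quoted in this post (the 7547-byte figure comes from the generated StartCom header):

```python
# Sizes quoted in the post: the StartCom-only PEM is 7547 bytes,
# and minification saves 175 bytes.
size_before = 7547
saved = 175
percent = saved / size_before * 100
print(round(percent, 2))  # matches the "~2.32%" figure above
```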

SSL context function with cURL and CyaSSL

For some reason, cURL disables this functionality when using SSL libraries other than OpenSSL. To solve the problem, we need to dive into the cURL source code for a bit.

This code applies for cURL version 7.29.0.

Source file: cyassl.c, cyassl_connect_step1()

Declare and define a variable at the top:

CURLcode retcode = CURLE_OK;

The rest of the code:

// ...
/* give application a chance to interfere with SSL set up. */
if(data->set.ssl.fsslctx) {
    retcode = (*data->set.ssl.fsslctx)(data, conssl->ctx, data->set.ssl.fsslctxp);
    if(retcode) {
        failf(data,"error signaled by ssl ctx callback");
        return retcode;
    }
}
#endif /* NO_FILESYSTEM */

Most of this was copied from ssluse.c.

Source file: url.c, Curl_setopt()

Look for the following code.

#ifdef USE_SSLEAY
    /* since these two options are only possible to use on an OpenSSL-
       powered libcurl we #ifdef them on this condition so that libcurls
       built against other SSL libs will return a proper error when trying
       to set this option! */
  case CURLOPT_SSL_CTX_FUNCTION:
    /*
     * Set a SSL_CTX callback
     */
    data->set.ssl.fsslctx = va_arg(param, curl_ssl_ctx_callback);
    break;
  case CURLOPT_SSL_CTX_DATA:
    /*
     * Set a SSL_CTX callback parameter pointer
     */
    data->set.ssl.fsslctxp = va_arg(param, void *);
    break;
  case CURLOPT_CERTINFO:
    data->set.ssl.certinfo = (0 != va_arg(param, long))?TRUE:FALSE;
    break;
#endif

Change the #ifdef:

#if defined(USE_SSLEAY) || defined(USE_CYASSL)

Project files: *.vcxproj

Add preprocessor definition if needed: USE_CYASSL

Using it

CURLcode sslContextCallback(CURL* curl, void* ctx, void* param) {
    return CURLE_OK;
}

// ...
curl_easy_setopt(curl, CURLOPT_SSL_CTX_FUNCTION, *sslContextCallback);

SSL context function with cURLpp

The callback signature used with plain cURL can't be used with cURLpp.

Expected code (not working)

If you're familiar with cURL, you would probably try this first.

CURLcode sslContextCallback(CURL* curl, void* ctx, void* param) {
    return CURLE_OK;
}

// ...
using namespace curlpp;
Easy easy;
// ...
easy.setOpt(new options::SslCtxFunction(sslContextCallback)); // CURLOPT_SSL_CTX_FUNCTION in cURL

This would give the following error (VC++):

error C2198: 'CURLcode (__cdecl *)(CURL *,void *,void *)' : too few arguments for call

Code for cURLpp

CURLcode sslContextCallback(void* ctx) {
    return CURLE_OK;
}

// ...
using namespace curlpp;
Easy easy;
// ...
easy.setOpt(new options::SslCtxFunction(sslContextCallback)); // CURLOPT_SSL_CTX_FUNCTION in cURL

I'm not sure at this time how to get the optional extra data, but for now, I don't need it either.