Friday, April 1, 2016

Build 2016, Day Two

Back in the hotel after another eventful day. As tired as yesterday. I’m getting too old for this, I think.

Awesome news during the keynote today. As you’ve probably heard by now, Microsoft is going to include Xamarin for free with every version of Visual Studio. Yes, every version. That includes the Community Edition, which means you will be able to develop iOS, Android and Windows Phone apps for free!

Well, you still have to have a Mac with Xcode installed for iOS. And you still need to pay the developer fees for all three platforms. BUT! You would have had to do that either way.

And best of all - you will use C# for them all. Which is really nice. For obvious reasons.

You still won’t have free access to the Xamarin Test Cloud though. Which I can understand, there’s a lot of infrastructure behind that one! You will be able to buy it as a standalone service, and it will be accessible through Visual Studio Team Services. Nice having everything in the same place, and they showed some really impressive deploy and test functionality for it.

We actually tried out Xamarin for our last internal conference, NAPC15. Personally, I really dug it. Awesome being able to share code between different platforms, and using Xamarin Forms a lot of the UI could be shared as well, cutting down on development time for everyone. Really nice being able to preview the iOS UI from your Windows machine too, with their remote viewer for the iOS simulator.

But, I digress. Back to Build!

There was a lot of talk about Azure today. And about how awesome it is. They now have 30 unique Azure regions and locations around the world, making sure everything is served as locally as possible. Quite impressive!

Another really impressive number is that over 85% of the Fortune 500 companies use Azure in some capacity!

Saw one really cool feature of Azure that I’ve totally missed until now… RemoteApp. You connect remotely, straight to an app hosted in the cloud. This means you can access it from pretty much whatever device. The example I saw was running Visual Studio on a Lumia 950 phone, through Continuum. Super cool!

Angular has had quite a presence during this Build as well. With Angular 2 coming out soon, there’s been a lot of talk about it. Seems rather sweet, they have definitely upped the speed even more, and have nice solutions when it comes to precompiling templates on .NET, Java, Node.js and more. They wrote all of Angular 2 using TypeScript, which definitely tells us how good TypeScript is becoming as well.

I definitely look forward to my third and last day of Build 2016 tomorrow. I can’t wait to see what more goodies we will learn! Until then - Good night America… wherever you are.

…Oh yeah. Midnight Caller, 1988. Good times, good times.

Thursday, March 31, 2016

Build 2016, Day One

First day of Build 2016 is over, at least for my part. It’s been fun, a bit chaotic, and man… My feet hurt like crazy!

After registering I went straight up to get in line for the keynote. Good call, there were already at least 200 others who had the same great idea - 1.5 hours before it was to start! Pretty much everyone attends the keynote, which is the broader look at what Microsoft has in store for the near and far future.

After some fluff about how they are to empower every person and organization on the planet, and figures about how many Windows 10 installations have been made (270+ million. Whoah!), they dropped the first bit of good news - the Windows 10 Anniversary Update is coming this summer. Seems to have a lot of goodies in that one, for sure! And, it will be totally free. Awesome!

They went on with some news about Microsoft Edge. They will integrate Windows Hello authentication so you can get to it through some simple JavaScript APIs. Great for logins to different sites etc - albeit, if people want to log in with their fingerprint there are already good solutions for it with third-party software - that work across browsers. My 2 cents: Good feature, sure, but just another Edge-exclusive API.

Big news for anyone looking into development for the Xbox One! From this summer (and out as a preview TONIGHT) any retail Xbox can become a developer edition by just flicking a checkbox in the settings. This removes the ability to start any games and apps you have installed, but instead you can connect to your Xbox from a Visual Studio 2015 installation and install any Universal Windows Platform application to test. When finished with the app, just release it to the store as usual. If marked as Xbox compatible, it will automatically turn up in the Xbox store as well. Pretty darn sweet! You can of course turn it back into a regular non-dev console at any time.

I’m really starting to dig the whole UWP thing. Will have to mess around with that a bit more. You get a lot for free as well: they automatically hook up the controller instead of mouse events etc. on the Xbox, and touch on phones/tablets/touchscreens. Of course, you can also very easily override it all to tailor it for each device.

They have talked a lot about bots and experts. Experts are essentially your app telling Cortana how she can use it, which keywords to trigger on etc. Seems easy enough to hook up to her. No pun intended. I really like this. Instead of going the route Apple is taking with Siri, as well as Google with Google Now, they let developers around the world extend Cortana as they see fit, and essentially make her smarter by helping out with the brain power. That also means she will become more tailored to you since it’s based on what applications you have installed giving her a boost through experts.

Integrations with Cortana seem to be huge within Microsoft as well. They showed some really cool (and kinda scary) demos of Cortana reading through your emails, looking for things you might have forgotten and automatically adding them as reminders and calendar events. Did you tell your mom you’ll call her tomorrow instead? Busted; this time Cortana will remind you of your white lie.

Bots are… Well, bots. Satya Nadella (Microsoft's CEO) was talking about a world powered by bots, where conversation is the platform, and voice is the UI. Good examples would be an automated bot taking your order at a drive-through restaurant, and other bots residing inside your favorite messaging app (which of course during Build everyone agrees is Skype no matter what we use otherwise). These bots could trigger on certain keywords, for instance if you talk about being hungry, instantly ask you if you want to order your favorite pizza. Like even easier access to pizza is a good thing for the world.

I can definitely see a lot of potential in this, and it really hit me when they showed a video of a blind man who had helped design technology that, with a pair of smart glasses, let him take pictures and have them described back to him through speech synthesis. Technology helping a blind person to see the world. Simply amazing.

And in other news;

  • Visual Studio Update 2 was released today.
  • HoloLens is ready, and started shipping to preorder customers today.

Coincidence? I… Think… Not!

Hopefully I have the strength to stand in that humongous line tomorrow and let you guys know if it’s worth the hype or not. That is, if I manage to keep my sanity after standing in that line. That line. That line scares me.

And alas, I saved the best for last!

Bash is coming to Windows! Amazing news! We’re talking real Bash, with all your favorite GNU tools, finally native on Windows.

Yes, I am a happy camper. A tired one though, so good night for now!

Friday, February 19, 2016

Documentation with code examples - the maintainable(?) way

We keep an internal base code platform at Nansen, on which several of our sites are based. We like to think of this as our bread and butter - meaning those parts of every project you copy+paste to the next. Sadly, we have been quite bad at documenting it. Or rather, we have been documenting it, but the problem is that the code examples tend to get out-of-date as the code naturally evolves over time.

To alleviate this I wanted to create a way to keep the documentation, and the examples supporting it, up-to-date, while still containing readable text. One could argue that an SDK is one way, but SDKs usually contain way too much fluff in the form of tables upon tables of method, property and event definitions. And they are also plagued by the out-of-date example code problem above, as examples are written in the <example> section of the summary documentation. It is easy to get lazy and forget about an example when a method or class is updated.

Anyways, my colleague Andreas 'Honungsprinsen' Oldeskog used a clever way to document our front-end framework by pulling markdown from GitHub and generating documentation pages from those files. I wanted to find a similar way, but with C# source code instead. This gave me three problems to solve:

  1. Having code examples that actually compile, and are relevant to the code I want to document.
  2. The ability to easily attach human readable documentation and informational text to said code.
  3. Some way of combining 1 and 2 in order to present it to the developers (hopefully) reading the documentation.

Now, for problem 1, part of the solution was so simple Andreas had to point it out to me: Use tests! I could just write unit tests - they run the actual code, and you can use assertions to verify that the examples actually do what you show to the reader of the documentation.
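To make the idea concrete, here is a minimal sketch of what such an example test could look like (StringUtility.Slugify and its behavior are hypothetical, purely to illustrate the pattern):

```csharp
using NUnit.Framework;

[TestFixture]
public class StringUtilityExamples
{
    /// <summary>
    /// Shows how `Slugify` turns a title into a URL-friendly slug.
    /// </summary>
    [Test]
    public void Create_a_slug_from_a_title()
    {
        // the example is real, running code...
        var slug = StringUtility.Slugify("Hello World!");

        // ...and the assertion keeps the documented output honest
        Assert.AreEqual("hello-world", slug);
    }
}
```

If the documented behavior ever changes, the assertion fails, so an out-of-date example is caught by the test run instead of rotting quietly in a wiki.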

The second part of the solution to problem 1 was finding a way to actually run the tests and present them to the reader. This was solved by using NUnit and its test runner functionality, somewhat hidden away in nunit.core.dll, an assembly that is unfortunately not distributed along with nunit.framework.dll in the NUnit NuGet package. I found it tucked away in a package named NUnit.Runners, and I had to manually reference the assemblies nunit.core.dll and nunit.core.interfaces.dll from the web project designated to run and present the results of my example tests.

When all that is achieved, it is pretty simple to actually run tests:

public class TestResultsEventListener : EventListener
{
    public TestResultsEventListener()
    {
        ListResults = new List<TestResult>();
    }

    public List<TestResult> ListResults { get; }

    public void TestFinished(TestResult result)
    {
        ListResults.Add(result);
    }

    // noop all other methods
}

public List<TestResult> RunSomeTests(IEnumerable<string> testNames)
{
    var testPackage = new TestPackage("ExampleCode"); // this can be named anything

    var eventListener = new TestResultsEventListener();
    var runner = new SimpleTestRunner();
    if (runner.Load(testPackage)) {
        var filter = TestFilter.Empty;
        if (testNames != null && testNames.Any()) {
            filter = new SimpleNameFilter(testNames.ToArray());
        }
        runner.Run(eventListener, filter, true, LoggingThreshold.All);
    }
    return eventListener.ListResults;
}
The result is a neat list of all our tests and their results - that solved problem 1, neat!
A bonus of this approach: since we use Episerver in our project platform, and the documentation/demo site is a basic implementation of said platform, all the service loading for Episerver is already done and the site is running, so I can use real data for any code interacting with the site without having to mock my heart out (seriously, unit testing Episerver code is 98% service mocking).

For problem 2, I decided to just use markdown syntax in the summary documentation of each test. If a more generic description or informative text was needed, I used the summary documentation of the test class. I also made sure to comment as much as possible of the test method code.

Now, problem 3 was trickier: I needed to somehow parse both the code and the summary documentation for my test cases, and present the result. Cue Roslyn! Using Microsoft's new Roslyn code parser, I could easily read a source file and extract whatever I wanted.
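Just as a minimal sketch of the kind of extraction Roslyn enables (the file name here is a placeholder), picking out all [TestFixture] classes from a source file boils down to:

```csharp
using System.IO;
using System.Linq;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

// parse the raw source text into a syntax tree...
var tree = CSharpSyntaxTree.ParseText(File.ReadAllText("ContentUtilityTests.cs"));

// ...and then query it like any other object graph
var fixtures = tree.GetRoot()
                   .DescendantNodes()
                   .OfType<ClassDeclarationSyntax>()
                   .Where(c => c.AttributeLists
                                .SelectMany(list => list.Attributes)
                                .Any(a => a.Name.ToString().Contains("TestFixture")));
```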

Combining all three solutions above gave me a helper that:

  1. Found all tests for a specific class,
  2. Ran the tests, and recorded the results,
  3. Parsed the source code for said tests and formatted the result

The code for my helper:

public class ExampleCodeHelper
{
    private readonly DirectoryInfo _baseDirectory;

    private readonly TestPackage _testPackage;

    public ExampleCodeHelper(string baseDirectory, IEnumerable<string> assemblyNames)
    {
        _baseDirectory = new DirectoryInfo(baseDirectory);
        _testPackage = new TestPackage("ExampleCode");
        foreach (var assemblyName in assemblyNames) {
            _testPackage.Assemblies.Add(assemblyName);
        }
    }

    public static ExampleCodeHelper Default
    {
        get {
            var root = HttpContext.Current.Server.MapPath("~") + ConfigurationManager.AppSettings["examplecode.sourcefile.rootpath"];
            var assemblies = ConfigurationManager.AppSettings["examplecode.test.assemblies"].Split(';');
            return new ExampleCodeHelper(root, assemblies);
        }
    }

    private IEnumerable<string> CodeFiles => Directory.GetFiles(_baseDirectory.FullName, "*.cs", SearchOption.AllDirectories);

    private IEnumerable<TestResult> RunTests(IEnumerable<string> testNames)
    {
        var eventListener = new TestResultsEventListener();
        var runner = new SimpleTestRunner();
        if (runner.Load(_testPackage)) {
            runner.Run(eventListener, new SimpleNameFilter(testNames.ToArray()), true, LoggingThreshold.All);
        }
        return eventListener.ListResults;
    }

    public ClassDeclarationSyntax GetTestClass(string classFullName)
    {
        var definitions = GetAllTestClassDefinitions();
        var classSymbol = definitions.Keys.FirstOrDefault(k => k.ToString() == classFullName);
        return classSymbol != null ? definitions[classSymbol] : null;
    }

    /// <summary>
    /// Retrieves a <see cref="ExampleCodeContent"/> containing a number of <see cref="TestResultItem"/> objects with test results
    /// </summary>
    /// <param name="classFullName">Full name (including namespace) for the test class</param>
    public ExampleCodeContent GetTestResults(string classFullName)
    {
        var definitions = GetAllTestClassDefinitions();
        var classSymbol = definitions.Keys.FirstOrDefault(k => k.ToString() == classFullName);

        if (classSymbol == null) {
            return null;
        }

        var classDeclaration = definitions[classSymbol];

        var content = new ExampleCodeContent {
            Text = GetSyntaxNodeDocumentation(classSymbol),
            ContainerClass = classDeclaration
        };

        var testCases = classDeclaration.DescendantNodes()
                                        .OfType<MethodDeclarationSyntax>()
                                        .Where(m => HasNamedAttribute(m.AttributeLists, nameof(TestAttribute)))
                                        .ToList();

        var tests = RunTests(testCases.Select(t => $"{classSymbol}.{t.Identifier.ValueText}"))
            .ToDictionary(key => testCases.Single(m => m.Identifier.ValueText == key.Test.MethodName), value => value);

        content.TestResults = tests.Keys.Select(method => new TestResultItem {
            Method = method,
            TestResult = tests[method],
            Text = GetSyntaxNodeDocumentation(GetDeclaredSymbol(method))
        });

        return content;
    }

    /// <summary>
    /// Returns all test class definitions found in the defined source folder.
    /// </summary>
    /// <remarks>
    /// Only classes decorated with the <see cref="TestFixtureAttribute"/> attribute are returned.
    /// </remarks>
    public IDictionary<ISymbol, ClassDeclarationSyntax> GetAllTestClassDefinitions()
    {
        var classes = new List<ClassDeclarationSyntax>();
        foreach (var file in CodeFiles) {
            try {
                var testClasses = CSharpSyntaxTree.ParseText(File.ReadAllText(file))
                                                  .GetRoot()
                                                  .DescendantNodes()
                                                  .OfType<ClassDeclarationSyntax>()
                                                  .Where(c => HasNamedAttribute(c.AttributeLists, nameof(TestFixtureAttribute)));
                classes.AddRange(testClasses);
            }
            catch (Exception) {
                // just ignore failing source files
            }
        }
        return classes.ToDictionary(GetDeclaredSymbol, value => value);
    }

    #region helper methods

    private static bool HasNamedAttribute(SyntaxList<AttributeListSyntax> attributes, string attributeName)
    {
        var simpleAttributeName = attributeName.EndsWith("Attribute")
            ? attributeName.Substring(0, attributeName.LastIndexOf("Attribute", StringComparison.Ordinal))
            : attributeName;
        return attributes.SelectMany(a => a.Attributes).Any(a => a.Name.ToString().EndsWith(simpleAttributeName));
    }

    /// <summary>
    /// Retrieves the <paramref name="documentationSection"/> documentation element "innertext" from the provided <see cref="ISymbol"/>
    /// </summary>
    private static string GetSyntaxNodeDocumentation(ISymbol symbol, string documentationSection = "summary")
    {
        var methodDocumentation = symbol.GetDocumentationCommentXml();
        if (!string.IsNullOrEmpty(methodDocumentation)) {
            var doc = XDocument.Parse(methodDocumentation);
            var nodes = doc.Descendants(documentationSection).FirstOrDefault()?.Nodes() ?? Enumerable.Empty<XNode>();
            return ParseSeeCrefs(string.Concat(nodes));
        }
        return null;
    }

    /// <summary>
    /// Changes any <see cref="TEXT" /> into <em>TEXT</em>, making it usable in html
    /// </summary>
    private static string ParseSeeCrefs(string inputText)
    {
        if (string.IsNullOrEmpty(inputText)) {
            return inputText;
        }
        var seeRegex = new Regex("< *see( +)cref=\"[^\"]:([^\"]+)\" */>");
        return seeRegex.Replace(inputText, m => "<em>" + m.Groups[2].Value + "</em>");
    }

    /// <summary>
    /// Returns the declared <see cref="ISymbol"/> for a specified <see cref="SyntaxNode"/>
    /// </summary>
    private static ISymbol GetDeclaredSymbol(SyntaxNode syntaxNode)
    {
        return CSharpCompilation.Create("MyCompilation", new[] { syntaxNode.SyntaxTree }).GetSemanticModel(syntaxNode.SyntaxTree).GetDeclaredSymbol(syntaxNode);
    }

    #endregion
}

With it, I can call ExampleCodeHelper.Default.GetTestResults("MyTestAssembly.MyTestResultClass") to compile a simple data model (source not shown) with documentation and test source code. The result looks something like below:

Source code:

/// <summary>
/// #Content utilities#
/// <see cref="ContentUtility"/> contains a number of useful methods and extensions for Episerver content.
/// Basically, it can be seen as extension methods for <see cref="IContentLoader"/>, though there are a number
/// of additional useful functions added.
/// Important to know is that the extensions, as opposed to the <see cref="IContentLoader"/>, will return null
/// instead of throwing an <see cref="EPiServerException"/> if not found or otherwise failing.
/// </summary>
[TestFixture(Category = "Episerver", Description = "Content utilities")]
public class ContentUtilityTests
{
    [Test]
    public void Load_Content()
    {
        var contentLink = SiteDefinition.Current.StartPage;

        //load content
        var content = contentLink.Get<IContent>();
        Assert.AreEqual(contentLink, content.ContentLink);

        //load content typed
        var startPage = contentLink.Get<PageData>();

        // a wrong type or otherwise failing load will return null
        var invalid = contentLink.Get<BlockData>();
    }

    [Test]
    public void Load_content_using_a_specific_language()
    {
        var contentLink = SiteDefinition.Current.StartPage;
        var content = contentLink.Get<PageData>(LanguageLoaderOption.FallbackWithMaster());
    }
}

Rendered result:

Thursday, January 14, 2016

NDC London 2016 - On site

Nansen is on site in London to attend the NDC London conference. After almost two days we have managed to attend a number of talks. We have heard about what's new in ASP.NET 5, MVC 6 and Entity Framework 7, and also how to attack your own site in order to identify future problems before they become problems. There has also been a lot of talk about Azure and automation. Jon Skeet and Scott Allen even had a little battle on stage, competing at answering questions from Stack Overflow. I'd say Jon came out the winner of that fight.
Hopefully all the talks will soon be published on Vimeo as well.

Christer Ottosson gets a taste of Jon Skeet's magic.

Christer Ottosson, Jenny Atmer och Kalle Hoppe

Tuesday, January 12, 2016

Nansen's protein junkies look after their interests

The wellness allowance is used frequently at Nansen. The coconut balls and cookies are not quite as popular as during other times of the year. Eggs are the hardware of choice right now.

Wednesday, November 11, 2015

Nansen Stockholm - office life

Dear visitors. At Nansen we like to share both soft and hard values, and let you take part in the highs and lows of what we do. Fluffy words that are not always easy to make concrete. Today, however, I will give you an example of the important details we wrestle with, even if their content value is perhaps not at the top of the scale.

We have had a nasty water leak at the Stockholm office, which is causing some inconvenience right now. A pipe had been leaking for quite a while by one of the toilets, and workmen have been here tearing down walls and floors. So now there are fans drying out part of the office. The fans hum quite a bit, which is why we prefer to keep the door closed between that part of the office and the open-plan area where we sit and work, so that people don't contract tinnitus as a work-related injury.
That hell of a door, which is supposed to be pulled shut, is a bit tricky to close, though. The geniuses here at Nansen are not only darn sharp at "koka webb" (cooking web), we are also pretty good at creating pedagogical notes and instruction videos. So naturally we solved the door-closing problem by putting up an incredibly pedagogical note, plus an instruction video on how to correctly close the door even though it's a bit wonky.

Friday, October 9, 2015

"Koka webb" in progress

What you see here is actually a rockstar and a sailor discussing some very intricate system architecture stuff at Nansen HQ, Stockholm. This is what we call "koka webb" (cooking web).

Friday, October 2, 2015

Timeout exception during PageTypeBuilder update - DefaultValueType need a companion

TL;DR: On a page property in PageTypeBuilder (PTB) - if you have DefaultValueType = DefaultValueType.Value, you must have DefaultValue set to something. Otherwise your site might time out on startup while PTB does its thing.


For the last few months we've had problems with a client's test environment (EPiServer 6) regarding nasty timeouts while PageTypeBuilder does its update on site startup. I spent some time cleaning the page types from missing/conflicting sort indexes on properties, and removed tons of old page types that weren't in use anymore.

The issue

After the above-mentioned optimization it felt like the startup ran smoother, but it was all an illusion. On every startup, when PTB ran its update, we got this exception:
Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
At first I assumed it was network related, since the hosting provider has had problems during this period with their firewall and other network issues.

The solution

After I configured EPiServerLog.config to log level ALL, I began to see the truth.
Just before every timeout error, the same page property appeared, over and over again:

2015-10-01 13:54:46,383 DEBUG [39] PageTypeBuilder.Synchronization.PageDefinitionSynchronization.PageDefinitionUpdater.UpdateExistingPageDefinition - Updating PageDefintion, old values: Name: AddressSlimInfoText|Type:|EditCaption:Prenumerationen registreras på din..|HelpText:|Required:False|Searchable:False|DefaultValue:|DefaultValueType:None|LanguageSpecific:False|DisplayEditUI:True|FieldOrder:1120|Tab.ID:166|,new values: Name: AddressSlimInfoText|Type: |EditCaption:Prenumerationen registreras på din..|HelpText:|Required:False|Searchable:False|DefaultValue:|DefaultValueType:Value|LanguageSpecific:False|DisplayEditUI:True|FieldOrder:1120|Tab.ID:166|.
I navigated to the code for the page type that had this property and saw this:
[PageTypeProperty(EditCaption = "Prenumerationen registreras på din..",
   Type = typeof(PropertyString),
   Tab = typeof(PurchaseFlowAddressSlimTab),
   DefaultValueType = EPiServer.DataAbstraction.DefaultValueType.Value,
   SortOrder = 1120)]
public virtual string AddressSlimInfoText { get; set; }

Took a while before I saw what was different from the other properties. This:
DefaultValueType = EPiServer.DataAbstraction.DefaultValueType.Value
But no DefaultValue was set.
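In other words, the two attribute properties need to travel in pairs. The corrected version of the property above would look something like this (the empty string is just a placeholder - use whatever default makes sense for the property):

```csharp
[PageTypeProperty(EditCaption = "Prenumerationen registreras på din..",
   Type = typeof(PropertyString),
   Tab = typeof(PurchaseFlowAddressSlimTab),
   DefaultValueType = EPiServer.DataAbstraction.DefaultValueType.Value,
   DefaultValue = "", // the companion value that was missing
   SortOrder = 1120)]
public virtual string AddressSlimInfoText { get; set; }
```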

I corrected this and then updated the test environment. Still got the same error, but now the log told me about the next property that had the same issue, and the next and the next etc.

So I had to go through all the page type properties in the code that used DefaultValueType and check to see if it had the matching DefaultValue or not.

When this was done - it took a while, the site has about 100 page types and I found 10-15 properties that had to be corrected - I updated the test site again and tried to trigger the timeout.

But alas, there was none.


It seems weird that this causes a timeout; there should be an ArgumentException or something further inside the EPiServer.DataAccess assembly that actually does the page definition update. Either way: when you set DefaultValueType to Value, you have to set DefaultValue to something.

For us, I think it was a combination of an unstable network environment and a time-consuming PTB update that triggered this timeout. As a developer, you can at least give PTB the best possible conditions so it runs as smoothly as possible.

Anywho… if you have weird timeouts on an Epi 6 site, check the obvious PTB mistakes such as duplicate/missing sort indexes etc, but also check the DefaultValueType/DefaultValue pairings.

Thursday, September 17, 2015

Getting Visual Studio intellisense to play nice with RequireJS

I was trying to get IntelliSense to work in the JavaScript files of my ASP.NET MVC project, for which I'm using RequireJS, and here's how it all worked out.

Working with _references.js

Visual Studio uses a file called _references.js to grant you the wisdom of IntelliSense in JavaScript files. First off, we need to make sure we have our _references.js file where Visual Studio can find it. By default, Visual Studio will search for this file in the location "~/Scripts/_references.js" if you're working with a web project. However, if your scripts folder resides in a different area of your project, you can easily change where Visual Studio looks for this file:

Options → Text Editor → JavaScript → IntelliSense → References

In the dropdown called "Reference Group" choose "Implicit (Web)".

If you're interested you can read more on the history of _references.js here

Configuring _references.js for RequireJS

I was reading up on the subject, but I was having issues with how the paths to my RequireJS modules were resolved, so my scripts weren't loaded correctly.
While reading the debug messages in the output window for the JavaScript Language Service, I saw where Visual Studio tried to load my files from: C:\Program Files (x86)\Microsoft Visual Studio 14.0\JavaScript\References\require.config.

The solution was to configure the baseUrl for RequireJS in the _references.js file. After that, RequireJS will handle the dependencies you have in your modules, and Visual Studio will grant you IntelliSense.


/// <autosync enabled="true" />
/// <reference path="Vendor/require.js" data-main="main.js" />

require.config({
    baseUrl: '~/Assets/Scripts/'
});

Decorate your scripts with comments

In Visual Studio 2015, Microsoft has added support for JSDoc, which is a documentation/comment syntax. If you use it, you get really sweet IntelliSense in combination with _references.js. So instead of just getting what functions and properties an object might have, you also get information about parameters and types, if you choose to write it. Visual Studio also has support for XML-based comments to complement your IntelliSense, but personally I prefer JSDoc.

You can read the full documentation of JSDoc online.

E.g. creating a module like this will grant you the IntelliSense in the picture below.


/**
 * @module modules/awesomo
 */
define('modules/awesomo', [
    'underscore' // the _ parameter below suggests an underscore/lodash dependency
], function (_) {
    'use strict';

    return {
        /** Will return a random number between 0 and 100. */
        randomNumber: function () {
            return _.random(0, 100);
        },

        /**
         * AWESOM-O will try to give you medical help
         * @param {string} bodyPart - The body part in need of medical assistance
         */
        giveMedicalHelp: function (bodyPart) {
            return 'Um, actually A.W.E.S.O.M-O is not programmed for that function.';
        }
    };
});

Happy coding.

Thursday, June 25, 2015

Offline, can not save EPiServer 7.x

We recently started getting an odd error message whenever we edited a page. EPiServer's auto-save functionality was throwing an error in the console:

POST http://mysite.local/episerver/cms/Stores/contentdata/ 500 (Internal Server Error)

We searched the internets for some clues about what could be wrong. Many posts hinted at page properties being mismatched with the settings in the database.

Digging further into the logs, we found this.

Here’s the stack trace:
[InvalidOperationException: This request has probably been tampered with. Close the browser and try again.]
   EPiServer.Framework.Web.AspNetAntiForgery.ThrowForgeryException() +374
   EPiServer.Shell.Services.Rest.RestHttpHandler.ProcessRequest(HttpContextBase httpContext) +109
   System.Web.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +913

   System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +165


ERROR - 1.2.5 Unhandled exception in ASP.NET
System.InvalidOperationException: This request has probably been tampered with. Close the browser and try again.
   at EPiServer.Framework.Web.AspNetAntiForgery.ThrowForgeryException()
   at EPiServer.Shell.Services.Rest.RestHttpHandler.ProcessRequest(HttpContextBase httpContext)
   at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

ERROR - Cross-site request forgery detected [Client IP: XX.XX.XX.XX, Referer: http://mysite.local/episerver/CMS/#context=epi.cms.contentdata:///317, Url: http://mysite.local/episerver/cms/Stores/contentversion/, User: UserName]

Although, the error that led us to the solution was this little fella:

"the required anti-forgery cookie __requestverificationtoken is not present"

It turns out that we had marked the cookies as secure (as we all should) with the configuration setting:
<system.web><httpCookies requireSSL="true" /></system.web>

But we were accessing the site over http. So the real underlying error was that last one: "the required anti-forgery cookie __requestverificationtoken is not present". The site was requesting secure anti-forgery cookies but was getting standard unsecure cookies - thus the tampering exception.

The solution: access the site over https, or change the setting to requireSSL="false".