Flux is a new space for working together
Last Thursday, Microsoft Finland launched a new venue in downtown Helsinki. Microsoft Flux is a mixture of a co-working space, a tech playground and a startup accelerator. A few startups work there, but anyone is free to visit the place and enjoy free coffee, the hardware lab and good company. We will also see more and more events being run in Flux - it's a nice place in a central location, after all.
Check Flux out at www.microsoftflux.com!
Slack is a new way of communicating
Really, Slack is old news. It's a well-known real-time chat platform for teams, used in many companies. What is new is that we have today opened up a Slack network for Finnish Microsoft professionals. Feel free to join the network - ask questions, rant or plan the next community event!
Join the MSgurut Slack network at msgurut.azurewebsites.net!
With the ticket sales storm of Build 2016, it is now officially time to reinvent your IT conference policy.
Microsoft-focused software companies have traditionally educated their developers in conferences run by Microsoft. This has been both a boon and a bane - many devs have learned their Microsoft stack well and with reasonable cost, but at the same time they have grown into the narrow-minded monoculture that has so pervaded the enterprise dev conferences. But things are changing.
Back in the old days, Microsoft's offering for developers was twofold: there was the Professional Developers Conference (PDC) for product launches and a view of the future, and TechEd for continuous education. PDC wasn't an annual event, but was reserved for years when there was something special to announce. TechEd was held every year, around the world - and was aimed at both developers and IT professionals.
PDC became Build, and we've had it every year since. It has always been a US-only event. TechEd later swallowed a few other conferences and then reincarnated as Ignite, which has only been held in the US (just once so far). Although Ignite has some developer sessions, it is really geared toward platform understanding - not really the material for most developers, at least if Ignite 2015 is anything to judge by.
The disappearing dev conferences
With Ignite having retreated to the US and IT Pro worlds, developers are left with only Build. And you can't use Build to educate your developers. This year, those few thousand tickets they sell for a Build… they sold out in five minutes.
So even if you're ready to pay the cost of a three-day conference in the rather expensive San Francisco, there's no guarantee you'll get in.
And if you happen to be one of the lucky ones, the conference is all about laying the groundwork for the future, not education for your everyday dev job.
So… With little fanfare, we have moved into a world where Microsoft dev partners need to look outside Microsoft's conference offerings to educate their developers. Your local Microsoft subsidiary may have something, but rarely anything even close to the magnitude of a TechEd on your own continent.
What if you're a Finn?
If you're into ASP.NET Core (formerly known as ASP.NET 5) development, also check out our one-day ASP.NET Core seminar!
What to do in 2016?
Look for other options. They exist both locally and globally. Here are some ideas, mostly from a Finnish perspective:
- NDC Oslo - in Norway, June
- Build Stuff 2016 - in Lithuania, November
- DevIntersection - Orlando in April, Las Vegas in October or Amsterdam in November
- VSLive! - seven four-day events around the US
There are probably plenty of others - typically there's been an Oredev (in Malmö, Sweden), but I'm seeing no details for 2016 yet. Many small conferences have improved over the last few years, and getting more attendees will help them improve further.
Also, start taking your online resources seriously. Channel9, Microsoft Virtual Academy and Pluralsight are nothing to be scoffed at. Employers need to accept that just doling out subscriptions and study time may well be one of the most effective approaches to some learning challenges.
If you're running an app on Windows 10 or Windows Server 2016 under .NET 4.0 or newer, you will see date parsing errors under seven locales. This blog post explains the issue, the fix (it's coming!) and the workarounds.
First off, I would like to thank the .NET Framework team at Microsoft for fixing this speedily. Jay Schmelzer and his people also helped by giving me all the details and proofreading this blog post. That said, my recommendations and any possible mistakes are still my own, not theirs. :-)
Edit 2015-10-09: This has been fixed now. The updates are as follows:
- KB3093266 - Windows 10
- KB3088956 - Windows Server 2012 R2 and Windows 8.1
- KB3088955 - Windows Server 2012 and Windows 8
- KB3088957 - Windows 7 SP1, Windows Server 2008 SP2, Windows Server 2008 R2 SP1, and Windows Vista SP2
Windows 10 changes the date and time formatting settings for some cultures. Of particular concern are seven cultures across three languages:
- Finnish
- Norwegian Bokmål ("Norway" and "Svalbard and Jan Mayen" variants)
- Serbian ("Cyrillic, Kosovo", "Latin, Montenegro", "Latin, Serbia" and "Latin, Kosovo" variants)
For these seven cultures, Windows 10 changes the date and time separators to be the same. For example, in Finnish, the standard date format used to be 26.8.2015 21:08, while it is now 26.8.2015 21.08 - note the subtle change in the time separator.
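You can see the change for yourself by inspecting the culture data on the machine you are running on. A minimal sketch (remember that on .NET 4.0 and later the values come from the OS, so the output depends on the Windows version):

```csharp
using System;
using System.Globalization;

class SeparatorCheck
{
    static void Main()
    {
        // Culture data comes from the OS on .NET 4.0+, so these values
        // reflect the Windows version this runs on.
        var fi = CultureInfo.GetCultureInfo("fi-FI");
        Console.WriteLine(fi.DateTimeFormat.DateSeparator); // "."
        Console.WriteLine(fi.DateTimeFormat.TimeSeparator); // ":" before Windows 10, "." on it
    }
}
```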
Note that any change in separators can cause your code to break - for example, if you have written a parser that imports time-of-day values by relying on certain culture settings, you may find that your parsing logic no longer works. But the fact that the time separator changed to be the same as the date separator creates an issue far more insidious and severe.
In all currently released versions of .NET, the DateTime.Parse method has a shortcoming: it always fails to parse a date or a date+time combination in a culture where the date and time separators are the same character. This bug, together with Windows 10's culture changes, breaks the previously hard rule of DateTime.Parse always being able to parse the culture's default DateTime representation. Now,
DateTime.Parse(DateTime.Now.ToString()) no longer works under the described conditions. Neither does DateTime.Parse(DateTime.Now.ToShortDateString()), which is somewhat counterintuitive since the changed time separator isn't even involved - but true nonetheless: the parser thinks it's parsing a time instead of a date.
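As a minimal repro sketch - this only fails on a system whose OS culture data has the identical separators, i.e. Windows 10 (or Windows Server 2016) without the fix:

```csharp
using System;
using System.Globalization;
using System.Threading;

class RoundTripRepro
{
    static void Main()
    {
        // Use one of the affected cultures; on unpatched Windows 10 its
        // date and time separators are both "." in the OS culture data.
        Thread.CurrentThread.CurrentCulture = CultureInfo.GetCultureInfo("fi-FI");

        string s = DateTime.Now.ToString(); // e.g. "26.8.2015 21.08.15"

        // On an affected system this throws a FormatException,
        // even though s is the culture's own default representation.
        DateTime roundTripped = DateTime.Parse(s);
        Console.WriteLine(roundTripped);
    }
}
```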
If you own the call site to DateTime.Parse, you can use ParseExact (see an example below in the workarounds section) and avoid the issue. In fact, using ParseExact is a best practice anyway if you know the format you're parsing from. But the real problem is that almost all of the .NET Framework relies on the assumption that default DateTime representations survive format/parse round trips - and thus, many parts of .NET don't use ParseExact.
The most problematic failures will occur in databinding and modelbinding code. You will find that WPF, WinForms and Modern (UWP) Apps will fail when binding dates to controls. ASP.NET MVC will fail on modelbinding from form input. It will simply be impossible to enter a valid date. Depending on your scenario, your application may throw FormatExceptions, your DateTime fields will always show errors and refuse to accept input, or your modelbound ASP.NET input will be empty.
The fix (and what about previous .NET or Windows versions?)
Microsoft has fixed this issue in a forthcoming update to .NET 4.6. This patch is planned to ship in Windows 10's September update. DateTime.Parse will be improved to deal with the scenario where the two separators are the same. The separators won't change back, and the new separators may still break your code in other scenarios - but that's not a Windows or .NET Framework bug.
How about previous versions of Windows? Since it was Windows 10 that introduced the new separators, you won't hit this on older versions of Windows. Technically you could - if your system were running a custom locale with the date and time separators set to the same character - but in practice custom locales are rare, and such custom locales are probably nonexistent. At the moment, there are no plans to patch this separately for older Windows versions.
How about previous versions of the .NET Framework? First off, you will not see this if you're running an app on a version older than .NET 4.0. That is because .NET 4.0 is the first version of the Framework to use the OS culture data - versions before that carried their own regional settings, and those don't contain cultures where the date and time separators are the same character.
If you can wait until the update comes out, you don't need to read the rest of this blog post. If you can't, I'll give you some ideas on what to do.
As I already stated, if you have an explicit DateTime.Parse call, replace it with ParseExact. That is fairly straightforward to do if you can special-case the seven cultures mentioned above. For example, for a Finnish date-only field, you can do
DateTime.ParseExact(stringToParse, "d.M.yyyy", CultureInfo.GetCultureInfo("fi-FI")).
For all the other scenarios, you need to apply the same kind of fix - the trick is knowing where to do it.
- For Windows Forms data binding, you need to attach a handler to the Binding.Parse event that special-cases the said cultures. For a big app, the problem is getting that set up for all the relevant bindings in your application. You might want to explore reflection or similar strategies for programmatically injecting custom parsers for all DateTime fields. If you already have custom parsers in place, good luck marrying the two together.
- For WPF data binding, you might want to embed this into an IValueConverter (see tutorial). Again, finding all the necessary injection spots may be a headache.
- For ASP.NET MVC, you need to write a custom ModelBinder. The nice thing here is that you are able to apply this binding easily to all DateTime/Nullable<DateTime> objects by just adding your binder to the ModelBinders collection in your application startup. However, it is important to remember that the default behavior for ASP.NET MVC is to use InvariantCulture for parsing HTTP GET request parameters but to use the CurrentCulture for HTTP POST parameters (more here).
- If you are at the deployment end of the stick (i.e. trying to get those .NET apps to run on computers you administer), you can turn the custom locale approach to your benefit and create a locale that has distinct date and time separators. You need to use Locale Builder for that - and yes, it works on Windows 10 even though the blog post only talks about 8.1. Of course, this doesn't help you as a developer if you cannot enforce the custom locale on your customers.
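To sketch the ASP.NET MVC route, a custom model binder registered for DateTime could fall back to ParseExact. This is an illustrative sketch, not code from the original post - the format strings and class name are my own, and a real binder would need to handle the GET/POST culture difference mentioned above:

```csharp
using System;
using System.Globalization;
using System.Web.Mvc;

// Hypothetical binder: tries ParseExact with the culture's own patterns
// before handing off to the default (Parse-based) binding logic.
public class SafeDateTimeBinder : DefaultModelBinder
{
    public override object BindModel(ControllerContext controllerContext,
                                     ModelBindingContext bindingContext)
    {
        var value = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
        if (value == null)
            return base.BindModel(controllerContext, bindingContext);

        // "d.M.yyyy" matches the Finnish short date; extend per culture as needed.
        DateTime result;
        if (DateTime.TryParseExact(value.AttemptedValue, "d.M.yyyy",
                CultureInfo.CurrentCulture, DateTimeStyles.None, out result))
            return result;

        return base.BindModel(controllerContext, bindingContext);
    }
}

// In Application_Start:
// ModelBinders.Binders.Add(typeof(DateTime), new SafeDateTimeBinder());
// ModelBinders.Binders.Add(typeof(DateTime?), new SafeDateTimeBinder());
```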
If you come across other scenarios or have solutions to share, add a comment. Please let me know if you have other feedback / questions / clarifications as well. Thanks!
On Friday, I created the Finnish Microsoft Developers' mailing list. Yes, a mailing list - and I knew I was going to get whacked for it. I guess it's time to do some explaining.
First off, thank you to everyone who sent me feedback. Even if you're calling me a freaking luddite from the 90s, I appreciate that you care.
I'll start by pointing out the key facts:
- I know how to use Yammer. I use it every day, and I have more networks than fingers.
- I think Yammerâs features would benefit the Finnish developer community.
- I initially wanted and intended to make this a Yammer network.
The decision to go with the mailing list was mine. I couldn't poll the community with ease, because… well, we don't have that community, at least not yet. Instead, I took input from a non-trivial number of people around the world who run lists, forums and Yammer networks like this. But there's no substitute for the local voice, and the discussion has now been opened. I'm happy to take input.
It all boils down to a single question: what is the best way to gain maximum engagement? This question needs to be answered per community. Even if Yammer works for your company or the Microsoft preview program you happen to be in, that doesn't imply universal acceptance.
Using email makes the YOS (Yammer-Office-SharePoint) crowd laugh out loud, while a more nuts-and-bolts type of developer may well treat Yammer as superfluous crap layered over simple, time-tested protocols such as email. As I pondered the needs of the Finnish developer community, I felt it was more important to take a lowest-common-denominator approach that would alienate the fewest important people.
And in this sense, I do feel that the Enterprise Social Network-minded information worker consultants are a less important minority - and, more pressingly, also one that is currently better served by other communities. Furthermore, I want to be extra careful not to alienate the people who are just entering the Microsoft ecosystem: groups such as Node.js developers and Linux devs just using Azure as a platform.
Put bluntly: I'd rather get the big masses from the trenches than try to cater to those who are already semi-professional community builders by trade (and yes, I believe being a Yammer fanatic sort of implies this).
Is Yammer really good?
At the very least, I like its features. I would love to have a great attachment story. Having online polls would be nice, and being able to "like" stuff instead of posting pesky +1 emails is… well, modern. But if you have nobody to send attachments to, nobody to answer your polls, and nobody to ever hit the like button… well, who cares?
On the other hand, Yammer isn't all greatness. For example, the use of groups tends to split the discussion. That works when you're a busy enterprise or a 5000-strong network of naturally chatty Microsoft MVPs, but try doing that in the traditionally quiet environment of Finnish software developers. Split up a low-volume discussion, and you have just achieved split silence.
In many ways, Yammer could be fixed. It would just need a better notification system, better client software, a better authentication model, better support for multiple networks… It would also help if I could make the network public - one of the better aspects of the current Google Groups implementation is that the message contents are indexed for public searchability.
After all is said and done, I think it most importantly comes down to this: almost no Yammer networks outside the immediate corporate sphere survive unless they have a strong host presence and a sense of urgency to them. It is possible to make a Yammer community thrive, and I respect all the Yammer community managers out there. But you also need to realize that I have nowhere near the reach or resources of an intra-company community manager. Either the list has a low barrier to entry and quickly proves itself useful to those who join - or it fails and disappears.
I chose email over Yammer because my faith in Yammer's ability to hold people after the initial rush of excitement isn't very strong. Yammer may well be the best technology in a sense, but the best technology doesn't always win. Yammer is fairly good at resolving issues that arise in an active community - so far, I don't trust its worth in fostering a community where one doesn't already exist.
I am not certain about my choice of technology, but a choice had to be made, and a mailing list felt (and still feels) best. It won't satisfy everyone, but nothing would.
I have no idea if this community will thrive. The odds are overwhelmingly against success (come on, doesn't "Finnish developer online community" sound like an oxymoron from the get-go? ;-)). It is possible to switch over to Yammer at some point if we want to - but I doubt that would change our fate.
Again, thanks for your feedback. I hope the above clarifies my thinking on the subject. Feedback is still welcome (in any language).
This blog post was originally written in Finnish, because it's all about being a Finnish Microsoft developer. In summary: there is now a mailing list for us.
After long deliberation, I have set up a mailing list for Finnish Microsoft developers. My hope is to gather a good crowd of participants from the Finnish .NET developer field and to turn the list into:
- A place for discussion. We have no forum where developers could throw out questions and conversation openers. There are plenty of foreign forums, but the language barrier still exists, and local answers are worth more to many.
- A meeting place. Very many Finnish software development organizations - and I visit quite a few of them every year - suffer from having no contact with, or understanding of, what others are doing.
- A source of ideas. A lot of interesting things happen around the world, and you can follow them on Twitter and countless news feeds. But what is actually being done in Finland, and who knows about it? The list offers a chance to highlight Finnish expertise as well.
Why a 90s-style mailing list and not, say, a Yammer group? This has been pondered a lot. My own belief is that a very simple tool like a mailing list makes it easier to reach the right people. For many, adopting Yammer involves a threshold, and even though it can be followed by email, few end up doing so. Many special-purpose Yammer groups have withered away, a bit like LinkedIn communities.
Is a mailing list the right tool, then, and will people make it their own? I don't know; only time will tell.
How to join the list?
The msdevfi list is implemented on Google Groups. You can reach the group through the link below.
You can read the list either in the browser UI or by email. Reading via the web forum is trivial, but if you have never subscribed to a Google Group by email, here are the instructions.
The group's front page looks like this:
Click the "Join group to post" button.
Two key settings are shown highlighted above. First, make sure that the "Email used for your membership" is an address you want the mail delivered to. Note that you also cannot post to the list from any address other than this one. If the address you want is not available, add it first in your Google account settings.
Second, the email delivery frequency is chosen from the drop-down menu. The bottom option means the list behaves the way mailing lists usually do. "Less than 1 per day" is just Google's estimate of the traffic - the list was empty when the screenshot was taken, hence the estimate. Hopefully the discussion picks up and larger numbers will appear in the menu later. You can of course subscribe to digests only, but then replying from your email client becomes considerably clumsier.
Welcome to the list - I hope you find it useful!
Did you Finns realize that there's a vibrant open data ecosystem growing in Helsinki? The global Open Data Day on 21st February also had a local manifestation: Open Data Day Helsinki Hackathon.
Whether you're deep in the bowels of an enterprise or busily hacking your next social media startup, it's easy to miss the public sector opening up. But at the Helsinki hackathon, there was definitely a buzz in the air. Here's a short recap of the key things I learned.
The Helsinki region cities publish a lot
There's an abundance of open data sets available from Helsinki Region Infoshare. Many of the data sets are static and updated only periodically, but some are real-time APIs. For example, you can get a 15-megabyte Excel spreadsheet of every purchase made by the City of Vantaa. Or you might treat yourself to a KML map data file of school regions in Helsinki. Or, getting more real-time, perhaps you'd fancy a real-time XML API for the location of snow plows in Helsinki?
The new tone of the City of Helsinki is "Open Source, Open APIs, Open Data, Open Government". To that end, there is a group of developers within the city, and they run a site called dev.hel.fi which acts as a point of contact between the city and developers using its APIs. They also frequently arrange events where you can meet other developers to discuss specific topics: for example, the next event will be about environmental issues and will feature introductions from various authorities and stakeholders.
The Open Source part manifests itself as a GitHub account for City of Helsinki.
And if you want a broader reach, the 6Aika project is working to unify APIs and open data methodology between the six largest cities of Finland (Helsinki, Espoo, Vantaa, Turku, Tampere, Oulu).
Journey Planner v.Next: Getting there, with more variety
The Helsinki Regional Transport authority HSL has offered Reittiopas, a journey planner for Helsinki region public transport, for years. It is about to be reincarnated: the next version is planned for a preview release during the coming summer. It will be based on open APIs - and will be implemented as open source.
The HSL dev site contains a lot of information on what's happening next, but the essence is this: there's already a lot of data available through open APIs. During 2015, there will be more real-time GPS data available. Also, supporting services such as parking areas near train stations will get their own data set: in the future, you will be able to check out if there are parking spots available at the time you plan to be there.
Even healthcare can be radically open
Janne Kääriäinen from Futurice presented some analysis of the results of opening up the healthcare data silos in the United Kingdom. I didn't find the very briefly presented analysis convincing enough to blog about, but Janne's presentation opened my eyes to how extreme openness can get.
For example, the UK Society for Cardiothoracic Surgery (in layman's terms: the group of doctors who cut hearts and lungs) publishes death risk charts per surgeon. As the diagram to the right shows, if you happened to be operated on by a gentleman called Qamar Abid (a randomly picked example), you would statistically be in good hands: his in-hospital mortality rate is below the national average.
Once we get this much data out in the open, the possibilities are endless.
Using Microsoft tooling
Microsoft was also one of the sponsors of the event, and the dynamic duo of Pasi MĂ€kinen and Drazen Dodik took the audience on a quick half-hour whirlwind tour of relevant Microsoft technology. I don't think the details of Microsoft's offering are much news to the readers of this blog, but here's a short recap of all the things Microsoft pushes in this space:
- "Use the tools you know": Microsoft supports everything from native Windows tools and HTML5 to cross-platform tools such as Unity, Cordova and Xamarin.
- BizSpark provides technological and financial support for startup companies.
- Azure provides out-of-the-box support for open data frameworks such as CKAN.
- API Management can help with publishing your data sources. Azure Marketplace Datamarket can provide a selling/distribution channel.
But whether or not you decide to go for Microsoft tooling, there's definitely plenty of things to be done. Enjoy!
Starting February 2015, I am a Microsoft Regional Director (RD). Needless to say, I am honored and happy for the award.
The Wikipedia RD article linked above doesn't paint a fully accurate picture of the program (at the time of writing), but the gist of it is correct: Regional Directors are not Microsoft employees, but rather an unpaid group of 130 senior technology professionals who are charged with the task of advancing communication between Microsoft and developers using Microsoft technology. And, starting from 2015, RDs' reach will extend from just developers to some broader categories of IT Professionals as well.
The legendary Ahti Haukilehto has traditionally been the Finnish RD - for the last two decades or so. He still is, but alas, only until the end of June 2015. I am glad we have a few months of time together, and I will work hard to be a worthy replacement.
What does this mean in practice?
I will devote a considerable slice of my time to this. I will blog and speak more, and I will be more available for discussions. How exactly all this will happen, I don't know yet - there are a lot of good options, and I need some time to think about it first.
I have some preliminary plans. They include visiting some Finnish Microsoft partners in order to better understand their business and needs for Microsoft. I will also have more in-depth discussions with various product managers and specialists at the local Microsoft subsidiary to understand where they stand with their products. With this, I hope to get an understanding of what's going on in "my region".
Once that is done, the next steps remain to be seen. I will have frequent contact with the product groups at Redmond, and will hopefully be able to turn all this into some benefit for the Finnish Microsoft community. There are a lot of possibilities, ranging in scope from simple (bringing new people together in projects) to maniacal (a Finnish technology conference à la NDC or Oredev).
I will release more details as my plans progress and the future becomes clearer. In the meantime, if you think there's something a Microsoft Regional Director would be able to help you with, get in touch with me. I look forward to energizing the future, and the Microsoft of 2015 is most definitely an excellent platform for that.
Expect more stories on this blog.
If you're a Finnish developer interested in Microsoft technology and the tools of the trade in 2013, you should join us at the local Microsoft premises on 16th December for a free user group session.
We will cover a broad array of topics. Right now, the agenda looks like this:
- An overview of the VS2013 wave, including recent changes such as Visual Studio Online and Xamarin co-operation
- The new web developer stuff in VS2013, specifically VS Web Essentials
- Visual Studio Online ALM features: task management, testing and automated builds
- Visual Studio Online version control: Git vs. TFVC, version control as a deployment tool
- Katana and the OWIN stack: running ASP.NET outside the IIS stack
- Visual Studio Online "Monaco" and App Insights
Registration is open! If you understand Finnish, how could you miss this?
You may not want your connection strings embedded in your web.config in plaintext - that would expose your database credentials to the whole world. This blog post shows how to secure the connection strings. It is ugly, but quite doable.
Note: the SQL Azure team posted a four-part series on this in September 2010. Some details have changed since, and this post aims to be more practical, shorter and easier to follow. I'll also discuss the common problems with the method. But feel free to read their version as well :-)
What do I need to do?
The .NET Framework supports encryption of configuration elements per configuration section. Thus, you'll probably end up encrypting your whole connectionStrings section. This is likely to be just fine for you.
There are the following stages to the process:
- Creating a certificate
- Importing the certificate locally
- Adding the encryption provider
- Adding the certificate to your solution
- Encrypting your configuration
Let's go through these steps one by one.
Creating a certificate
You will need to create a certificate that is used as the encryption key. First you use this certificate to encrypt the configuration, then your site uses it to decrypt the configuration. To achieve your security goals, you should make sure that the certificate and its password are safely stored. Ideally, your developers would never even see it.
Open up a Visual Studio Command Prompt (for example, a "VS2012 x64 Cross Tools Command Prompt", "Developer Command Prompt for VS2012" or similar), and type the following. "MyApp Connection Strings" is just a name for the certificate, and the various MyApp names are file names. There is no magic in those names.
makecert -r -pe -n "CN=MyApp Connection Strings" -sky exchange "MyApp.cer" -sv "MyApp.pvk"
You will need to type a password for the certificate (thrice). Keep it safe, because you'll need it. Next, you'll need to merge the .cer and .pvk files into another certificate file format, .pfx. Do this with the following command line:
pvk2pfx -pvk "MyApp.pvk" -spc "MyApp.cer" -pfx "MyApp.pfx" -pi "password"
Put the password you set earlier into the argument value for the -pi switch.
Importing the certificate locally
To be able to encrypt the configuration, you must import the certificate on a workstation. Tap Windows-R and run mmc.exe to open up the Management Console. Add the Certificates snap-in by using File > Add/Remove Snap-in and enabling the Certificates snap-in. Select the "Computer account" option in the next dialog, and "Local computer" in the next one.
Once you have added the snap-in, open the Personal store, right-click to open the All Tasks menu and choose Import.
Use the import wizard to import the certificate. By default, the file browser filters to *.cer files, which contain only public key information useful for encrypting the configuration information. The default options for the following dialogs are fine.
Under normal circumstances, you should only distribute the .cer file to people who need to encrypt the configuration. The .pfx (and .pvk) files contain the private key that can be used to decrypt the configuration - that should be kept behind locks and only installed into Azure. But for practical purposes, you may want to install the .pfx at this point so you can actually test the thing before pushing it out. If you do import the .pfx, you also need to enter the password you chose earlier.
Adding the encryption provider
The pfx file represents the certificate format known as PKCS #12. To use the certificate in your application, you need to write a class that supports said encryption format and still works in Windows Azure. Microsoft actually provides one, but it comes in a Visual Studio 2008 solution. You can just get the relevant class file from here: Pkcs12ProtectedConfigurationProvider.cs.
Add the class into your solution. You can put it wherever you want, but the place must be precompiled into an assembly (i.e. an MVC project, class library, just not ASP.NET App_Code or other runtime compilation spots). You may need to add some assembly references (System.Configuration, System.Data, System.Security, System.Xml) to make it work. Change the namespace to something you want to see in your application: you will need the full type name later on.
Adding the certificate to your solution
In Solution Explorer, open the Roles node under your Cloud Service project and open the Properties of the web role whose configuration you want encrypted. Choose Certificates and Add the Certificate. The default store location of "LocalMachine" and store name of "My" are good. The Name field defaults to something like "Certificate1", but that is not important.
Click the "..." button in the Thumbprint field to select the certificate. You'll get the dialog seen on the right.
Once you have picked the certificate, you will get a thumbprint value that is a hex string ("64DC1BB08D4E993D6D2A20BB543079F55968F4F4"). This value identifies your certificate, and for your next cloud publish to succeed, a certificate with this thumbprint must already be uploaded to Azure.
Encrypting your configuration
First, set up your configuration encryption provider by adding the following section to your web.config. Remember that if you have a configSections block, it must be the first block in the configuration file. In fact, encrypting your configuration deletes your configSections block if it isn't the first section - that's probably a bug.
<configProtectedData>
  <providers>
    <add name="Pkcs12Provider"
         thumbprint="64DC1BB08D4E993D6D2A20BB543079F55968F4F4"
         type="MyApp.Utils.Pkcs12ProtectedConfigurationProvider, MyApp"/>
  </providers>
</configProtectedData>
Remember to replace the example's thumbprint with your own certificate's thumbprint from the previous step. If you lost it, just open the Certificates properties page for your web role under the Cloud Service project. Also, make sure the type attribute contains the full namespaced name of your provider and, after the comma, the assembly name (e.g. "MyApp.Security" for MyApp.Security.dll).
Now it is time to actually encrypt your connection strings, which is the ugly part. The encryption is done by issuing the following command in the Visual Studio command prompt, while in the same directory as your web.config. The parameters you pass in are the name of the configuration section to be encrypted, the path of the configuration file and the name of the provider.
aspnet_regiis -pef "connectionStrings" "." -prov "Pkcs12Provider"
But don't do it just yet. If you do, you will get an error along the lines of "Could not load file or assembly 'MyApp'", referring to the assembly that should contain the provider.
Note: If you run this in a directory that doesn't have a web.config to encrypt, you'll get an error message stating "The protection provider 'Pkcs12Provider' was not found". A strange error, but check your current working directory.
Aspnet_regiis will look for the specified assembly in a very limited set of lookup paths. First and foremost, it looks for the assembly in the GAC. Since it is typically cumbersome, and even bad practice, to register your application DLLs in the GAC, you shouldn't do that.
Microsoft's canned solution (to which I linked above) works around this by wrapping the configuration provider in a separate DLL, which is then strongly named. The solution also contains an installer project that puts the provider into the GAC. But since the installer project doesn't build in new versions of Visual Studio, that is fairly clumsy. That said, if your organization already has a tool DLL that is typically deployed into the GAC, you can include the provider in that assembly. If you do that, everything should just work (but of course, you need to refer to the assembly with its full name and all the attributes).
So, the next issue: How to make aspnet_regiis find your DLL without pushing it into GAC?
The easiest approach is to copy your DLL into the directory where aspnet_regiis.exe resides, typically C:\Windows\Microsoft.NET\Framework\v4.0.30319 or similar. You end up dropping your binaries among .NET's internal ones, which isn't very nice, but it works.
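As a sketch, the workaround boils down to something like the following in the Visual Studio command prompt (the bin\MyApp.dll path and the framework version are assumptions you must adapt to your project):

```
rem Copy the provider assembly next to aspnet_regiis.exe, then encrypt.
copy bin\MyApp.dll C:\Windows\Microsoft.NET\Framework\v4.0.30319\
aspnet_regiis -pef "connectionStrings" "." -prov "Pkcs12Provider"
rem Optionally clean up afterwards.
del C:\Windows\Microsoft.NET\Framework\v4.0.30319\MyApp.dll
```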
If you want to avoid some of that mess, put the DLLs in a separate directory as outlined in a StackOverflow answer. Unfortunately, the directory must still reside under aspnet_regiis.exe's location. You can go further with features such as DEVPATH, but I wouldn't bother unless you really need to.
Once you've resolved the DLL question and run aspnet_regiis successfully, your configuration file should be properly encrypted. If you did install the .pfx into your certificate store, you can also test this by running your application: if everything works, your application can evidently access the encrypted configuration.
In case the Pkcs12ProtectedConfigurationProvider throws an ArgumentNullException with the parameter name keyObject when you're accessing the configuration, you probably have not imported the private part of the key (i.e. you chose the .cer file when importing).
Note that you don't have to modify your callsites for accessing the configuration at all: the .NET Framework ConfigurationManager and other configuration classes take care of invoking the decryptor when needed.
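For example, perfectly ordinary configuration access code like the following sketch keeps working unchanged ("MyDb" is a hypothetical connection string name):

```csharp
using System;
using System.Configuration;

class ConfigDemo
{
    static void Main()
    {
        // ConfigurationManager sees the configProtectionProvider attribute
        // on the encrypted section and invokes Pkcs12Provider transparently.
        var setting = ConfigurationManager.ConnectionStrings["MyDb"];
        Console.WriteLine(setting.ConnectionString); // plaintext at runtime
    }
}
```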
Uploading the certificate to Azure
The only thing left to do is to push the certificate to Windows Azure. Open the Azure Management Portal, choose your cloud service and go to the Certificates tab.
Click the Upload button at the bottom of the screen, choose your .pfx file and enter your password.
Note: If you upload the .cer file, Azure will accept it, but you will get the (already mentioned) ArgumentNullException for parameter keyObject when trying to access the configuration.
Once you're done with this, publish your new package to the cloud. If everything went as planned, you should now have a working solution.
Practical tips and tricks
There are quite a few ways to manage the encryption process in real development scenarios. For most teams, it is worth encrypting only the production Database (and Azure Storage) access credentials.
The easiest way to achieve this is probably the following:
- Designate an IT pro, lead developer or another liaison who is allowed to know the production credentials.
- Have the trusted person install the public key (.cer file) on their computer, which enables them to encrypt the credentials.
- Make your version-controlled web.config have the development configuration; typically plaintext is just fine here.
- Once the trusted person has created an encrypted connectionStrings section, put that in a config transformation file with the xdt:Transform="Replace" approach. You will also need to use xdt:Transform="Insert" to add the configProtectedData section.
- You can now freely check in the config transformation as well.
- As is normal with encryption keys, store the private key (.pfx) and its password carefully, preferably in separate locations.
Finally, here is an example of a transformation file that handles the described scenario.
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings configProtectionProvider="Pkcs12Provider" xdt:Transform="Replace">
    <EncryptedData Type="http://www.w3.org/2001/04/xmlenc#Element"
                   xmlns="http://www.w3.org/2001/04/xmlenc#">
      <EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#aes192-cbc" />
      <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
        <EncryptedKey xmlns="http://www.w3.org/2001/04/xmlenc#">
          <EncryptionMethod Algorithm="http://www.w3.org/2001/04/xmlenc#rsa-1_5" />
          <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">
            <KeyName>rsaKey</KeyName>
          </KeyInfo>
          <CipherData>
            <CipherValue>A4K8W5Hr...</CipherValue>
          </CipherData>
        </EncryptedKey>
      </KeyInfo>
      <CipherData>
        <CipherValue>G5cpXKvbY...</CipherValue>
      </CipherData>
    </EncryptedData>
  </connectionStrings>
  <configProtectedData xdt:Transform="Insert">
    <providers>
      <add name="Pkcs12Provider"
           thumbprint="64DC1BB08D4E993D6D2A20BB543079F55968F4F4"
           type="MyApp.Utils.Pkcs12ProtectedConfigurationProvider, MyApp"/>
    </providers>
  </configProtectedData>
</configuration>
Have a nice encryption!
If you're in Finland next week and speak Finnish, there's a good reason for you to come over to the local Microsoft premises on Thursday 13th afternoon.
Offbeat Solutions and Kompozure will be organizing a free user group seminar on practical Azure experiences. I will personally give an overview of Azure's PaaS and IaaS offerings, which will then be followed by presentations on the following topics:
- Building web applications on Azure
- What does a developer need to learn when moving to Azure?
- Automated deployment into the cloud
- Windows Azure Media Services
- Windows Azure Mobile Services
If you're interested, check out the agenda and register at http://sanko-azure.eventbrite.com/. Welcome!