Monday, April 16, 2012

gitstorecred - Store HTTPS Credentials for Git

UPDATE: Thanks to some help from Paul Betts this has been updated to use some new APIs in Git. Check out the blog post on the GitHub Blog for more info!

As you probably know, the ASP.Net team has moved to CodePlex for all of our out-of-band frameworks (MVC, WebPages, WebAPI, etc.). You probably also know that we are using the Git source control system on CodePlex. You may also know that CodePlex only supports the HTTPS transport for Git, which means that a username and password are required every time you push changes to the server:

You can bypass the first prompt by adding your remote with “https://username@server.com/path”, and can even bypass the second by using “https://username:password@server.com/path” as the URL, but that’s a really Bad Idea™. So, like most modern systems, Git probably has a way to store these credentials, right?

Nope.

However, it does have an interesting extension point called “AskPass”. If you set the “core.askpass” config setting, or the GIT_ASKPASS environment variable, to the path of an executable, Git will run that executable every time it needs a username or password. It passes the prompt it would have written to the console as the command-line argument to the program, and expects the program to write the credential it is requesting (username or password) to the standard output stream. We can use this to our advantage!
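To make the contract concrete, here is a minimal C# sketch of an AskPass-style program. This is just an illustration of the mechanism described above, not gitstorecred itself, and the class name and placeholder credential values are made up:

using System;

class AskPassSketch
{
    // Git runs this program once per prompt, passing the prompt text it would
    // have written to the console, e.g. "Username for 'https://server.com': "
    // or "Password for 'https://user@server.com': ".
    static int Main(string[] args)
    {
        if (args.Length == 0)
        {
            return 1; // Git always supplies the prompt text as an argument
        }
        // A real tool (like gitstorecred) would show a credential dialog here
        // and/or read the Windows Credential Store; these values are placeholders.
        if (args[0].StartsWith("Username", StringComparison.OrdinalIgnoreCase))
        {
            Console.Write("some-username");
        }
        else
        {
            Console.Write("some-password");
        }
        return 0;
    }
}

Point core.askpass or GIT_ASKPASS at the compiled executable and Git will call it instead of prompting you on the console.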

GitStoreCred is a small tool I created for doing exactly that. It allows you to replace the Git username and password prompt with a standard Windows credential dialog AND it will store those credentials in the Windows Credential Store (if you check the “Remember” checkbox). Let’s see it in action:

Much better! Once we enter our credentials, everything works and Git pushes happily to the server. Let’s take a look at the Windows Credential Store now (this is in your Control Panel, go to User Accounts then Credential Manager):

See the highlighted entry? That’s my credentials for CodePlex (you’ll see I’ve also got credentials for GitHub there). If I try to push again, git won’t prompt me because gitstorecred just found the existing credential and outputted it.

Enough talking, gimme gimme gimme!

Ok, so you can download gitstorecred from GitHub and you can view the code there too. Just put the exe anywhere and set the GIT_ASKPASS environment variable to its full path. I use my PowerShell profile script to do it, since I’m always in PowerShell.

WARNING! HERE BE DRAGONS!

This is super hacky; I literally bashed it up over this weekend. It works, but I make no guarantees. The major known issues are the following:

First, if you type the wrong password, you have to go to Credential Manager (above) and either edit or just remove your credential. Gitstorecred has no way of getting feedback from Git so it doesn’t know that your password was wrong (for example) and will just keep sending the wrong password. If you checked “Remember” then the only way to get it to prompt you again is to remove the credentials.

Second, it doesn’t seem to get in the way of SSH-based pushing, but it might, so be careful. If you ever find Git hanging when you run “git push”, just clear out the GIT_ASKPASS setting and try again. It’s entirely possible gitstorecred isn’t responding the way Git expects.

Third, we only store one set of credentials PER SERVER. That means that if you have two user accounts for some reason (say “anurse” and “bnurse”) for some server (say “github.com”), gitstorecred just stores the first one as the credentials for “github.com” and passes them along. Due to some issues with the Windows Credential API I use, it can’t easily key the credential off your user account. For that reason, I highly recommend adding your git remote WITHOUT the user name (i.e. “git remote add origin https://git01.codeplex.com/aspnetwebstack”). That way gitstorecred will handle the username and password and you won’t get into weird issues. Of course, the credential store is per Windows user, so if you have two separate users on the same machine, they can have different credentials for the same server.

Please do try it out though; the worst that should happen is that it won’t work and you’ll have to turn it off. I’m happy to take pull requests to try and make it more powerful/robust (for example, there are better Windows Credential APIs I could use, but they are harder to access from Managed Code). With a tool like this, and the fact that Git is much smarter about HTTP pushing now, I’m switching almost exclusively to HTTPS for my Git pushes.

Enjoy! Here are the links again: Download EXE, Source Code on GitHub.

Friday, April 13, 2012

What else is new in Razor v2?

In my last post I discussed some of the big new features in Razor v2. In this post I’m going to talk a bit about some of the other (admittedly smaller) new features. So, let’s get right to it!

Void Elements

The HTML spec defines a certain type of element called a “Void Element” as

[An] element whose content model never allows it to have contents under any circumstance

— W3C HTML 5 Spec section 4.3

Put more simply, a void element is a type of element that can NEVER EVER have contents. It can come in three different forms:

  <!-- Self-closing -->
  <input name="foo" /><p>A different tag</p>
  <!-- Closed -->
  <input name="foo"></input><p>A different tag</p>
  <!-- Unclosed -->
  <input name="foo"><p>A different tag</p>

All three of these forms are considered valid HTML5. However, in Razor v1, we only allowed the first two, because we had a much simpler parser. In Razor v2, you can now use the third form as well. This works because if a void element’s start tag is not self-closed AND is not IMMEDIATELY followed by its end tag (whitespace is allowed), it is considered closed at the “>” of the start tag. So in Razor, when we parse a void element and reach the “>”, we look ahead and check whether we see “</[tagname]>” (we allow whitespace between the start and end tags). If we do NOT see it, we consider the tag closed. This means that if you typed the following inside a code block (i.e. @if() {}):

<p><input name="foo">Some content</input></p>

Razor would end the markup block at the “</input>” tag. Why? Because the input element was closed by the “>” of its start tag, so the “</input>” tag is considered an extraneous end tag. Since it has no matching start tag (remember, the “<input>” is already closed), we assume it belongs to a start tag Razor can’t see (because it’s outside the code block this markup is within, or even in a different document), and we end the block there.

For the most part, this shouldn’t affect you adversely, since syntax like the sample above is invalid HTML, but please let me know if you end up in an edge case where this is occurring in legal HTML.

Finally, what are the elements HTML5 considers void? The spec lists them off for us and Razor uses this exact same list:

area base br col command embed hr img input keygen link meta param source track wbr

— W3C HTML 5 Spec section 4.3

Syntax Tree and Internals Overhaul

NOTE: This part is going to dive into parser internals a bit. Feel free to skim ;)

The last thing I’m going to talk about is a behind-the-scenes change that’s mostly relevant to people hosting the Razor parser. In order to support these exciting new features, we had to overhaul our parser internals and syntax tree structure. Razor now uses full HTML, C# and VB tokenizers and the parse tree gives you access to that granularity. For example, given the code:

@foo.bar.baz

In Razor v1, this would be two Spans (a type of parse tree node), one for “@” and one for “foo.bar.baz”. Each span would contain the string pulled from the input file. However, in Razor v2, we produce the same two spans, but now each span is a collection of Symbols. In this case, the first Span contains one symbol (“@”) and the second Span contains five symbols (“foo”, “.”, “bar”, “.”, “baz”). This allows us to perform more advanced analysis of the input document without having to reparse strings over and over.
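If you host Razor yourself, you can see this structure directly. Here is a rough sketch, assuming the System.Web.Razor 2.0 hosting types (RazorEngineHost, RazorTemplateEngine and friends); the exact member names are from memory, so treat it as illustrative rather than authoritative:

using System;
using System.IO;
using System.Linq;
using System.Web.Razor;
using System.Web.Razor.Parser.SyntaxTree;

class SyntaxTreeSketch
{
    static void Main()
    {
        var host = new RazorEngineHost(new CSharpRazorCodeLanguage());
        var engine = new RazorTemplateEngine(host);
        // Parse a tiny template and walk the spans the parser produced.
        ParserResults results = engine.ParseTemplate(new StringReader("@foo.bar.baz"));
        foreach (Span span in results.Document.Flatten())
        {
            // In v2, each Span carries the individual symbols (tokens) it was built from.
            var symbols = string.Join(", ", span.Symbols.Select(s => "'" + s.Content + "'"));
            Console.WriteLine("Span '{0}' -> [{1}]", span.Content, symbols);
        }
    }
}

For the “@foo.bar.baz” template above, the second span should report the five symbols described earlier.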

We also reworked our higher-level syntax tree nodes, Span and Block. In Razor v1, we broke the file into chunks called spans and used sub-classes of the Span class to mark their type (for example: MarkupSpan, CodeSpan, HelperHeaderSpan, TransitionSpan). In v2, we removed all of those sub-classes and moved to an “Annotations” model. Going back to our previous example, in Razor v1 we would have produced a TransitionSpan and an ImplicitExpressionSpan (a kind of CodeSpan). In v2, both are concrete instances of Span; however, they have various properties which attach annotations to control how they behave. For example, each Span has a CodeGenerator annotation which indicates how we generate C#/VB code from the node. They also have an EditHandler annotation which indicates how the editor should behave around that Span. By doing this, we (and even you, if you want!) can add new syntax without having to dramatically overhaul all the various pieces of our infrastructure.

In future posts, I’m going to use some of this information to show you how to create a new kind of directive that works at runtime AND design-time (i.e. in the Editor).

Conclusion

Well, that’s basically it for Razor v2. It’s not a super-long list but believe me it was a lot of work. In many ways, Razor v1 was our “hey, check out this cool language!” release. Razor v2 has primarily been about internal clean-up and future-proofing (not that one can ever be totally future-proof). We added some cool little features, but the work done to the underlying structures should make it easier for us to add even more features and extensibility points in the future.

Enjoy Razor 2, Web Pages 2 and MVC 4!

Tuesday, April 10, 2012

What’s new in Razor v2

I know, it’s been too long since I blogged, and I won’t even bother saying I’ll try to blog more (though I will) because hopefully you’ll see that for yourself. A lot has happened since I last blogged but the biggest thing is that the ASP.Net Web Stack (which is what we call the out-of-band stuff we ship, like MVC, WebPages, WebAPI and Razor) is now truly an open-source project! We’re hosted on CodePlex (http://aspnetwebstack.codeplex.com/) and if you want more details, check out Scott Hanselman’s awesome post about it.

Now that we’re open, I’ll be able to start talking a little more directly about what we’re doing. It’s very exciting for us, and to alleviate your worries that open-source means Microsoft isn’t working on it anymore, we’re still working full-time on our current release and even starting a bit of planning for v.Next.

All that aside, we also released a new version of Razor since I last blogged, and I thought I’d give you a quick run-through of the features. In later posts, I’ll go over the details of what’s changed, as well as some new information for those hosting Razor outside of ASP.Net on how to take advantage of those features.

~/ – Url resolution made easy, goodbye @Href/@Url.Content!

One of the most common patterns in MVC Views is this:

<script src="@Url.Content("~/Scripts/myscript.js")"></script>

Well, in Razor v2, you can express the same intent with much less code, and it looks much cleaner too:

<script src="~/Scripts/myscript.js"></script>

Note the lack of any code transitions! How does this work? It’s quite a simple algorithm: whenever we see an attribute value that starts with “~/”, we treat it like a URL and replace it with a call to @Href (in WebPages) or @Url.Content (in MVC). Note that this works for ANY attribute, so if you typed the following:

<div class="~/Foo">

We’ll treat “~/Foo” as a URL. We made this choice because we didn’t want to limit you to a list of attributes that we think have URLs, especially since you might want to put URLs in a data- attribute. Imagine using a data- attribute to tell your client-side JavaScript what the root of your application is:

<html data-root="~/">

Then you can use jQuery to access this data ($('html').data('root')) and use it when making Ajax calls to make sure your app is portable even if it’s in a sub-folder.

What if you want to inject code snippets into your URL? That’s easy: just treat it like you would if you were injecting code into any other attribute value:

<a href="~/MySite/@Foo/Bar/@Baz">Something!</a>

I should note that we’re actually not doing anything particularly special here; the code above is equivalent to the following MVC code in Razor v1:

<a href="@Url.Content("~/MySite/")@Foo/Bar/@Baz">Something!</a>

So we are just resolving that first portion of the URL and then going back to regular string concatenation for the rest.

Conditional Attributes

The other major feature (there are a few others I’ll go over in later posts, but they are smaller) is Conditional Attributes. I’ll freely admit we borrowed this feature heavily from the fantastic Spark View Engine written by our very own Louis DeJardin.

Have you ever typed code like this in Razor v1?

@{ var cls = GetClass(); }
<div id="foo" @if(cls != null) { <text>class="@cls"</text> }>

If not, let me explain why you’d want to do this. In Razor v1, null was treated the same as an empty string so if you were to have code like this:

<div id="foo" class="@cls">

Then if cls was null, we’d render

<div id="foo" class="">

Eugh! That looks ugly! In Razor v2, the same code would render this:

<div id="foo">

Note the missing class attribute? We’ve even taken away the leading space! Another feature of this is that we’ll also collapse whitespace within the attribute value:

@{ string foo = null; string bar = "bar"; }
<div id="foo" class="@foo @bar">

Becomes:

<div id="foo" class="bar">

We also special-case boolean values. If the expression evaluates to false, we treat it the same as null. If it evaluates to true, we render out the attribute name again. This allows you to write code like the following for checkboxes:

<input type="checkbox" checked="@isChecked">

If isChecked is true, we render checked="checked"; if it’s false, we don’t render the checked attribute at all.

Finally, we do NOT treat String.Empty ("") like null in this case. If the expression evaluates to an empty string, we WILL render the attribute:

@{ var foo = ""; }
<div class="@foo">

Renders:

<div class="">

The reason for this lies in the difference between null and String.Empty. Null indicates the complete absence of a value, whereas String.Empty is a value, a string of length 0.

In the currently released (Beta) version of Razor, we do this for all attributes. However, in the next release (and, in fact, in the live code on CodePlex) we have entirely disabled the conditional attributes feature for data- attributes. You'll still get the ~/ URL resolution, but we won't remove null values or do fancy tricks with booleans in data- attributes. This was done because those attributes are usually read from JavaScript, so the semantics are different and we wanted to give you full control.

There’s much more!

That’s just a quick summary. I’ll be publishing another post soon with even more new features. Then we’ll go into how Conditional Attributes is implemented and how you can make sure your custom Razor hosts support this feature.

Please don’t hesitate to let me know what you think in the comments!

Wednesday, December 21, 2011

PS-Get 0.6 - Now with a Gallery!

I’ve been hard at work, and not just at my day job! It’s been a while, but I’ve finally gotten around to creating a proper website for PS-Get, including… (drumroll please)… a Gallery! Blatantly ripped off (er, borrowed) from the NuGet Gallery code, the PS-Get Gallery is a place to upload PS-Get modules for others to download. It also contains detailed installation instructions, but thanks to a new installer script, it’s a pretty simple install! To install, just run this command from a PowerShell 3.0 or PowerShell 2.0 (with the .NET 4.0 workaround; see below) window:

(new-object Net.WebClient).DownloadString("http://install.psget.org")|iex

And then you’re all set! There’s rudimentary support for automatically injecting the proper code into your profile, but if you want to do it manually, all you need is Import-Module PS-Get.

So, what’s new in PS-Get 0.6?

Import-Package Cmdlet

The Import-Package cmdlet is a one-step cmdlet that takes the name of a module as its primary argument. It checks whether that module is installed, and if not, it installs it for you. Once the module is installed, it imports it into the current environment. For example, consider the following code:

Import-Package PS-Json

If PS-Json is already installed, this is identical to Import-Module PS-Json. However, if PS-Json is not installed, it’s installed first. This means that if you sync your profile across multiple machines, all you need is some code to check for and install PS-Get, plus a bunch of Import-Package statements, to get all your modules installed and ready! In fact, the PS-Get install script (which you can download here) takes parameters that let you skip the entire wizard. So you could have code like the following in your profile and have PS-Get and all your favourite packages installed automatically!

if(@(Get-Module -l PS-Get).Count -eq 0) {
    Write-Host "This is the first time you've run this profile on this machine so we need to install PS-Get first..."
    (new-object Net.WebClient).DownloadString("http://install.psget.org")|iex
}
Import-Module PS-Get
Import-Package PS-MagicStuffThatILove

Package Source Management

New in PS-Get 0.6 is the ability to manage multiple package sources. Using Add-/Remove-/Get-PackageSource you can manage package sources at three scope levels: Machine (global to all users), User (global to all PowerShell sessions for the current user), Session (specific to just this PowerShell session). These sources will all be searched for packages when Install-Package and Update-Package are used to get new packages. The installer script automatically sets up the PS-Get Gallery as a machine-level source. You can still override the source by using the “-Source” parameter of most cmdlets.

Export-Module Cmdlet

PS-Get 0.6 adds a new cmdlet, “Export-Module”, which can be used to take an existing PowerShell module and export it to a NuPkg file that can then be uploaded to the PS-Get Gallery. If the module directory contains a NuSpec file, we’ll use that; otherwise we’ll gather as much data as we can from the PSD1 manifest file (if there is one). Missing data can be filled in using arguments to the Export-Module cmdlet. Eventually, the plan is to make it easier to manage extra PS-Get-related metadata for modules, either by augmenting the existing PSD1 manifests or by providing tools to manage NuSpec files for modules.

One thing this is missing right now is dependencies. If your module depends on another module, you’ll need to create a NuSpec file for it. Removing this restriction is priority 1 for enhancing the Export-Module cmdlet.

Conclusion/The Future

I hope you’ll try it out and start submitting packages to the gallery. It’s ready to go, but I’ll be perfectly honest: I’m doing this with no budget, so I’m working with free/cheap hosting solutions and the site may go down.

If you want to help with the future of PS-Get, check out our GitHub page. Fork the code, file some issues on our tracker or just read the wiki!

Monday, May 2, 2011

Catharsis and Humanity

This is a rare non-tech-related blog post, and I hope you’ll forgive me for it. It’s also possible you may find it controversial, so I’d ask your patience in reading the entire thing before jumping to conclusions. There will likely be another non-tech post soon, since the Canadian Federal Election is wrapping up, but I needed to get this post out before then.

As everyone who isn’t living under a rock knows, on Monday, President Obama announced that US Special Forces had located and killed Osama Bin Laden. My girlfriend (who has posted something about this herself) brought up a quotation (attributed to either Mark Twain or Clarence Darrow depending on who you ask) which I think accurately sums up my feelings: “I’ve never wished a man dead, but I have read some obituaries with great pleasure”. There’s no question in my mind that Osama Bin Laden was responsible for terrible things, and his death has brought closure to many people. I certainly don’t fault the people in Washington DC and New York City for feeling a sudden urge to celebrate out of catharsis. One of the people I follow on Twitter, Stephen Toulouse, tweeted that “catharsis doesn’t choose a time to be appropriate” which really makes a lot of sense to me.

Despite all that, I feel… uneasy (for a significant lack of a better term)… about the way we are celebrating the death of another human being, as evil as he was (whatever “evil” truly is). I know that on September 11, 2001, there were people cheering in the streets for the death of human beings too, and I think it’s important to remember that as we go through this catharsis. Please do NOT get me wrong, I can fully understand the need to celebrate, it is human to have such a strong outpouring of relief and emotion when closure is found. For the people around the world who were directly affected by the events of 9/11, this will most definitely be an almost happy occasion, and I would not fault them for feeling that way. As I noted earlier, I myself am certainly not upset that Bin Laden is dead or even that people are celebrating that fact.

I think it is entirely natural to feel relieved and to want to express that relief, but now that the catharsis is over, we should rise above our enemies and remember that we respect all life, even those who feel so oppressed as to need to strike out violently. What Osama Bin Laden and al-Qaeda did 10 years ago was WRONG, full stop. However my moral code also tells me that any time a human being is killed by another, it is because of a failure in our society. I am still struggling to figure out what that means, since I too feel a lot of relief over the closing of this chapter in our history, but I think what’s most important is that we recognize this event for what it truly is: A purge of our emotions and frustrations which have been building up since 9/11, not a victory for humanity as a whole. There’s nothing wrong with celebrating the former, as long as you keep the latter in mind and reflect on what we need to do in order to achieve true peace.

Please don’t hesitate to post legitimate responses and discussions to this post below. I welcome disagreements. However, I will not hesitate to remove hurtful or hateful comments. Above all else, RESPECT each other, because that is how we rise above.

P.S. Apparently “catharsis” is my word of the day ;)