August 2005 - Posts

After more than 8 hours looking at progress bars, my brand new VPC with beta stuff is ready. It contains:

  • Windows XP SP2 + Platform SDK
  • Office 2003 SP1
  • .NET Framework 1.1
  • .NET Framework 2.0 Beta 2
  • WinDbg 6.5
  • Internet Explorer 7 Beta 1
  • Visual Studio 2005 Team Suite Beta 2
  • SQL Server 2005 CTP
  • Microsoft Shell Beta 1 (MSH aka Monad)
  • WinFS Beta 1 + SDK
  • WinFX Beta 1 (Avalon, Indigo) + Visual Studio Extensions + SDK

Time to clone the VPC hard disk image (about 16 GB of used disk space) and start playing with all this stuff. You'll see posts about it over here later on. Stay tuned!

Read it on the WinFS Team Blog: it's here! MSDN Subscribers can grab the bytes on the subscriber downloads website. I've just installed it and will start playing with it later this week. Note that it only runs on Windows XP with SP2, not on Windows Server.

Looks like a promising successor to "Writing Secure Code, 2nd Edition", which every serious developer should have on his/her bookshelf (after reading it, of course :-)). More information on Amazon, where I just ordered the book.

This *really* was the last blog post before my (OOF) holidays. I'll be offline till August 28th. Keep up the good work and stay tuned!

A couple of months ago I did a post on "SHA-1 also insecure?". SHA-1 is one of today's widely used hash functions. Simply stated, a hash function is a one-way function H that takes a variable-length message M and returns a fixed-length hash value h, i.e. h = H(M). The idea of such a hash function is to produce a fingerprint that represents the original message, e.g. to be used in "authentication" of a message (checking a message's integrity against tampering, for instance). Such a hash algorithm needs to meet three properties:

  • One-way: it should be computationally infeasible to compute message M given the hash value h.
  • Weak collision resistance: given a message M1 it should be computationally infeasible to find another different message M2 such that H(M1) = H(M2).
  • Strong collision resistance: it should be computationally infeasible to find a pair of different messages (M1, M2) such that H(M1) = H(M2).
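As a quick illustration of the "fixed-length fingerprint" idea, here's a small Python sketch (purely illustrative; hashlib is not what Windows itself uses) showing that SHA-1 maps any message M, however long, to a 160-bit digest, and that a tiny change in M yields a completely different fingerprint:

```python
import hashlib

def sha1_hex(message: bytes) -> str:
    """Return h = H(M) as a hex string; SHA-1 output is always 160 bits."""
    return hashlib.sha1(message).hexdigest()

short = sha1_hex(b"abc")
long_ = sha1_hex(b"abc" * 100000)

# Both digests have the same fixed length (40 hex chars), regardless of |M|.
print(len(short), len(long_))
# A one-character change in M yields a completely different fingerprint.
print(sha1_hex(b"abc") == sha1_hex(b"abd"))
```

Note that this demonstrates the fixed-length property only; the three resistance properties above are claims about computational infeasibility, which no code snippet can demonstrate.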

This week (Crypto 2005 conference in Santa Barbara, California), two papers by Xiaoyun Wang from Shandong University (China) about finding collisions in SHA-0 and SHA-1 were presented (the papers themselves were already published earlier):

The newly announced results (not yet in the papers) show that an attack of complexity 2^63 is possible, and it's likely that the complexity will drop further. At a complexity of 2^63, finding SHA-1 collisions becomes feasible.
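To put that 2^63 figure in perspective: a generic birthday attack against any 160-bit hash needs roughly 2^80 operations, so the announced attack shaves a factor of 2^17 (about 131,000) off the best generic bound. A trivial back-of-the-envelope check:

```python
# Generic birthday bound for an n-bit hash: ~2^(n/2) operations.
n = 160
generic_attack = 2 ** (n // 2)   # ~2^80 for SHA-1
wang_attack = 2 ** 63            # complexity of the announced attack

# Speed-up factor of the announced attack over brute-force collision search.
print(generic_attack // wang_attack)  # 2^17 = 131072
```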

The introduction

Oh my god, I couldn't stop working the night before I leave for a short one-week holiday :o. But hey, that's what holidays are for, isn't it? :d Let's turn serious now. Last week I've been spending my free time on writing some security-related demo scripts, including the XP SP2 and W2K3 SP1 security boost features. One of these is the Attachment Execution Service (AES). The idea is to isolate potentially unsafe attachments in order to prevent them from harming your computer. "Attachments" here means mail attachments, but also files received through instant messaging.



First, let's point to the API on MSDN: IAttachmentExecute. The sample included with the API documentation shows the basic workings of this interface. The idea is that a browser, mail client or IM tool calls the various methods of this interface to perform operations such as saving the attachment (method Save) or executing the attachment (method Execute, passing in the "action" to be taken, e.g. open, print, edit, ..., depending on the file type registration parameters in the registry). I'm currently writing a piece of sample code on how to use this interface, which I'll try to post later on.

How does this help to enhance security, you might wonder? The answer can actually be found in the API documentation. For instance, take a look at the method Execute. You should find the following remarks:

IAttachmentExecute::Execute may run virus scanners or other trust services to validate the file before executing it. Note that these services can delete or alter the file.

IAttachmentExecute::Execute may attach evidence to the local path in its NTFS alternate data stream (ADS).

You'll find similar remarks on the method Save:

IAttachmentExecute::Save may run virus scanners or other trust services to validate the file before saving it. Note that these services can delete or alter the file.

IAttachmentExecute::Save may attach evidence to the local path in its NTFS alternate data stream (ADS).


The demo

What's interesting is the NTFS alternate data streams (ADS; KB article 105763) part. Here's a demo walkthrough scenario:

  • Download the LADS tool from Frank Heyne to display NTFS alternate data streams. If you're not familiar with the alternate data streams feature of NTFS (which was originally added to NTFS for Macintosh compatibility reasons), the KB article mentioned above is a good starting point. Extract the lads.exe executable into some folder, say c:\temp

  2. Start, Run, cmd, OK

    >cd %userprofile%\My Documents\My Received Files
    >c:\temp\lads.exe

    Note: This assumes you're using MSN Messenger and you've already received a file from one of your contacts. Alternatively, you can download a file using Internet Explorer 6 (XP SP2, W2K3 SP1) and perform the operations above on the directory where the file was saved.

  3. In the output of the lads.exe run you should find stuff like this:

    \My Documents\My Received Files\1.pdf:Zone.Identifier

  4. The Zone.Identifier is the alternate data stream associated with the file during the attachment saving through IAttachmentExecute. Let's view this ADS using Notepad (replace 1.pdf by one of the files you've found):

    >notepad 1.pdf:Zone.Identifier

    You should find something like this:

    [ZoneTransfer]
    ZoneId=4

    This indicates the file comes from zone 4 (the Restricted sites zone).

  5. Now find some executable somewhere on your system which was not downloaded from the internet. To boost our level of geekiness during the demo we'll go to a Visual Studio .NET 2003 Command Prompt, open up Notepad (optional, see further ;-)), write a "Hello World" console application in C# and invoke csc.exe.

    Setting environment for using Microsoft Visual Studio .NET 2003 tools.
    (If you have another version of Visual Studio or Visual C++ installed and wish
    to use its tools from the command line, run vcvars32.bat for that version.)

    C:\Documents and Settings\bartds>cd \temp

    C:\temp>echo class Hello { public static void Main() { System.Console.WriteLine("Hello World"); } } > hello.cs

    C:\temp>csc hello.cs
    Microsoft (R) Visual C# .NET Compiler version 7.10.6310.4
    for Microsoft (R) .NET Framework version 1.1.4322
    Copyright (C) Microsoft Corporation 2001-2002. All rights reserved.


  6. Open up Windows Explorer and go to the temporary folder used in the previous step (say c:\temp). Locate the hello.exe executable and run it. It should work fine without a single warning.

  7. Switch back to the command prompt window and do the following:

    C:\temp>notepad hello.exe:Zone.Identifier

    Notepad will scream "Cannot find the hello.exe:Zone.Identifier file. Do you want to create a new file?", which you'll answer by clicking Yes. Enter the following two lines:

    [ZoneTransfer]
    ZoneId=4

    Hit File, Save and close Notepad.

  8. Switch back to Windows Explorer and run hello.exe again. It still won't issue a warning. Ouch! However, when you close Windows Explorer and open a new Windows Explorer instance for c:\temp, you should no longer be able to execute the hello.exe file:

  9. Open the file properties for hello.exe and you'll notice the Security warning at the bottom of the dialog. Click Unblock. Now the file will execute correctly.

  10. Switch back to the command prompt window and try the following (again):

    C:\temp>notepad hello.exe:Zone.Identifier

    Again, Notepad will scream "Cannot find the hello.exe:Zone.Identifier file. Do you want to create a new file?", which indicates that the Unblock operation in Windows Explorer has removed the :Zone.Identifier ADS from the file. Answer Notepad's warning message with Yes and enter the following:

    [ZoneTransfer]
    ZoneId=3

    Hit File, Save and close Notepad.

  11. Close the existing Windows Explorer instance and open a new one, pointing to c:\temp. Now try to open the hello.exe file. Instead of blocking the execution as with ZoneId=4 in steps 7-8, you'll now get a warning:

  12. Uncheck the "Always ask before opening this file" checkbox and click Run. The executable will run fine. Switch back to the command prompt window and try the following (once again):

    C:\temp>notepad hello.exe:Zone.Identifier

    Once again, Notepad will scream "Cannot find the hello.exe:Zone.Identifier file. Do you want to create a new file?", which indicates that clearing the "Always ask before opening this file" checkbox in the Security Warning dialog has removed the :Zone.Identifier ADS from the file. End of demo.
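The Zone.Identifier stream written by IAttachmentExecute is plain INI-style text, so it's easy to inspect programmatically. Below is an illustrative Python sketch (not part of the AES API; the zone-name table follows the URLZONE enumeration from urlmon.h) that parses the stream contents used in the walkthrough above. On Windows/NTFS you could read the stream itself by opening the path "hello.exe:Zone.Identifier" with an ordinary file API; here the content is inlined so the sketch runs anywhere:

```python
import configparser

# Zone ids as defined by the URLZONE enumeration (urlmon.h).
ZONE_NAMES = {
    "0": "Local machine",
    "1": "Local intranet",
    "2": "Trusted sites",
    "3": "Internet",
    "4": "Restricted sites",
}

def parse_zone_identifier(stream_text: str) -> str:
    """Parse INI-style Zone.Identifier content and return the zone name."""
    ini = configparser.ConfigParser()
    ini.read_string(stream_text)
    zone_id = ini["ZoneTransfer"]["ZoneId"]
    return ZONE_NAMES.get(zone_id, "Unknown zone " + zone_id)

# The two stream contents used in the demo steps above:
print(parse_zone_identifier("[ZoneTransfer]\nZoneId=4"))  # Restricted sites
print(parse_zone_identifier("[ZoneTransfer]\nZoneId=3"))  # Internet
```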

I'll try to post some code showing the IAttachmentExecute interface's functionality later on. If I find some time after my holidays, I'll also blog about my DEP demo adventures (see next screenshot).

Going to bed now in order to be alive and kicking on my holidays :d.

To keep the number of nonsensical posts at its usual level, I decided to add a favicon.ico file to the root of my website and my blog website. You should see my favicon on your MSN Search Toolbar tabstrip (okay, it's not art with a big A, but at last I have such an icon up and running :-)).

How to do this? Just put a favicon.ico file in the root of your website folder and you're done. For the .Text blog pages you need to copy the favicon.ico file to the blog's root folder, and you have to change the web.config file so that it serves .ico files too, as follows:

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
   <configSections>
      <section name="BlogConfigurationSettings" type="Dottext.Framework.Util.XmlSerializerSectionHandler, Dottext.Framework"/>
      <section name="HandlerConfiguration" type="Dottext.Framework.Util.XmlSerializerSectionHandler, Dottext.Framework"/>
   </configSections>
   <!-- ... -->
   <HandlerConfiguration defaultPageLocation="DTP.aspx" type="Dottext.Common.UrlManager.HandlerConfiguration, Dottext.Common">
      <HttpHandler pattern="(\.config|\.asax|\.ascx|\.cs|\.csproj|\.vb|\.vbproj|\.webinfo|\.asp|\.licx|\.resx|\.resources)$" type="Dottext.Framework.UrlManager.HttpForbiddenHandler, Dottext.Framework" handlerType="Direct"/>
      <HttpHandler pattern="(\.gif|\.js|\.jpg|\.zip|\.jpeg|\.jpe|\.css|\.ico)$" type="Dottext.Common.UrlManager.BlogStaticFileHandler, Dottext.Common" handlerType="Direct"/>
      <!-- ... -->
   </HandlerConfiguration>
</configuration>

One last thing for now. I'll be on holidays till August 28th, so my blog won't be updated next week. See you soon and keep the feedback coming :-).

I like this one very, very much. It knows my city (Zottegem, Belgium) and redirects to the nearest place with known weather data (Munte, Belgium, about 10 km from my home). I even tried to specify our local city "communities" and guess what... the toolbar knows them too (it's using the MapPoint web services in the background to find the specified places, see map):


You can download the add-in over here:

Get it as fast as lightning from the WinDbg download page, where the new features are mentioned as well. For an excellent tutorial of WinDbg-aided "Application debugging in a production environment", check out Hans De Smaele's MSDN Belux article (125 printed pages; PDF format). Enjoy both the new release and the tutorial!

Tonight I was doing a little demo prep for security-related stuff in Windows Server 2003. One of the things I was preparing is the "service account credentials" problem. As you probably know, Windows services run in the context of a user (just like any other process does). Typically such a user is SYSTEM, Network Service or Local Service, but it can also be a custom account specified during installation of the service or via the services.msc MMC snap-in. The use of a custom account gives you better control over what the service is allowed to do (privileges, ACLs) and therefore can lead to better security (assuming you're not adding that account to the Administrators or Power Users group, e.g. "for simplicity of configuration", oooh). Now, when you do specify such a custom service account, you also need to specify the password for that account. Windows needs to store that password somewhere locally on your machine, namely under the registry key HKLM\SECURITY.

Now your natural reaction should be: WinKey-R, regedit, My Computer, HKEY_LOCAL_MACHINE, SECURITY. You'll be disappointed, I guess: nothing visible in there. Check the permissions of the key and you'll see that only the SYSTEM account has access to it. Okay, you could go to the Permissions dialog and temporarily grant the Administrators group access to the key in order to look at the contents, but instead of doing that, I'd like to show another trick. Notice that this approach relies on a very bad assumption: it implies that you are running as Administrator on the machine. However, for the sake of this demo I'm breaking the rules. Now the trick. Go to the command prompt and type the following commands:

C:\Documents and Settings\Administrator>net start Schedule
The requested service has already been started.

More help is available by typing NET HELPMSG 2182.

C:\Documents and Settings\Administrator>time /T
04:13 AM

C:\Documents and Settings\Administrator>at 04:14 /INTERACTIVE cmd.exe
Added a new job with job ID = 1

After (less than) a minute you should see a new Command Prompt window appearing in your interactive session. In fact, it won't have C:\Windows\System32\cmd.exe in its title bar, but you'll see C:\Windows\System32\svchost.exe instead. What's so special about this Command Prompt is that it's running in the context of the Task Scheduler service's account, being SYSTEM:

C:\WINDOWS\system32>whoami
nt authority\system

When you now start regedit through this prompt, it will inherit the security context of the parent process, being the cmd.exe shell hosted by svchost.exe running as SYSTEM. You can check this as follows:

C:\WINDOWS\system32>tasklist /FI "USERNAME eq NT AUTHORITY\SYSTEM" /FI "IMAGENAME eq regedit.exe"

Image Name                     PID Session Name        Session#    Mem Usage
========================= ======== ================ =========== ============
regedit.exe                   4680 Console                    0      3,360 K

That should convince you, right? Now back to the Registry Editor, where you can open the HKEY_LOCAL_MACHINE\SECURITY\Policy\Secrets key because SYSTEM has access to it.

In this very key Windows stores a lot of secrets that need to be kept in plaintext for further usage, including service account passwords. The store includes the machine's computer key in a domain (HKEY_LOCAL_MACHINE\SECURITY\Policy\Secrets\$MACHINE.ACC), security information of various services (HKEY_LOCAL_MACHINE\SECURITY\Policy\Secrets\_SC_*), Terminal Services Licensing keys (HKEY_LOCAL_MACHINE\SECURITY\Policy\Secrets\L$TermServ*), ASP.NET auto-generated machine keys (HKEY_LOCAL_MACHINE\SECURITY\Policy\Secrets\L$ASP.NETAutoGenKeys*), RAS dial parameters (HKEY_LOCAL_MACHINE\SECURITY\Policy\Secrets\RasDialParams!S-1-5-21-99659525-1695905762-1439541511-500#0; the -500 suffix of the SID stands for the local Administrator account), cached password hashes for the last logons, web user passwords and FTP passwords.

Take a look in here carefully. From the list mentioned above, the _SC_* keys are the most interesting ones to us. For example, I have MIIS installed on my machine, which runs as a service called "miiserver" with a custom service account. In the registry I can find the following key - HKEY_LOCAL_MACHINE\SECURITY\Policy\Secrets\_SC_miiserver - containing security information for the service. Originally, Paul Ashton reported the LSA secrets problem on the Windows NT BugTraq mailing list in August 1997 (link) with a basic exploit. A first thing I wanted to show is that this code doesn't work anymore. However, the SYSTEM account still has access to the (obfuscated) data in some way, as the SCM needs the information in order to run the service correctly.

The moral of the story is pretty straightforward: a compromised local administrator can lead to a domain-wide attack by extracting service account passwords that may well be domain accounts. Having such a domain account can be enough to launch a successful attack against the domain and other systems (servers, workstations) in that domain.

One tool that can be used nowadays to extract this information from the LSA secrets is the lsadump2 tool from Bindview. The source code is included with the tool and gives you better insight into how it works (lsadump2.c). In broad lines, the tool finds the lsass.exe (local security authority subsystem) process, enables the debug privilege (needed to run the tool), injects a DLL into the process (dumplsa.dll, compiled from dumplsa.c) and uses named pipes to communicate with the DLL's functionality in order to print output to the screen. So basically, the tool hooks itself into the LSA in order to retrieve information directly from the source. The dumplsa.c file contains the source of the LSA dumping functionality (refer to the DumpLsa function, which iterates through the HKLM\SECURITY\Policy\Secrets key and calls the LsarOpenSecret and LsarQuerySecret functions through function pointers obtained by GetProcAddress on the lsasrv.dll library). Try to run the tool yourself and be convinced of its power!

However, some things went wrong earlier this night. The tool - running on W2K3 SP1 - brought the lsass.exe process down, causing some weird system behavior. First of all, I got the well-known shutdown countdown dialog that says "This system is shutting down. Please save all work in progress and log off. Any unsaved changes will be lost. This shutdown was initiated by NT AUTHORITY\SYSTEM (...) Time before shutdown: 00:00:60" with the additional message: "The system process 'C:\WINDOWS\system32\lsass.exe' terminated unexpectedly with status code .... The system will now shut down and restart.". When you see this dialog, do Start, Run, cmd, OK, shutdown /a, ENTER as fast as you can to cancel the system shutdown.

Also check out Mark Russinovich's article on "Running Windows With No Services" that was published in July this year. It has a screenshot of this "System Shutdown" countdown dialog.

The death of lsass.exe is, in fact, just the beginning of a series of troubles. After I saved my work, I tried to reboot the computer through Start, Shut Down, Restart, OK, resulting in the following error message:

Right, lsass.exe is responsible for user privileges indeed, and the privilege to shut down the system can't be evaluated anymore for the current user. Damn. What about shutdown /s at the command prompt? Doesn't work either, as you can see in the following screenshot:

In a bit of a panic, I tried to play the at.exe trick to run the shutdown.exe command as LOCAL SYSTEM. However, at.exe doesn't work anymore either:

The Task Manager can't find out about the process owners anymore as you can see below:

And whoami is out of order too:

So I tried to log off, which worked well. However, pressing CTRL-ALT-DEL resulted in a screen with only the mouse pointer left. No chance to log in again. Powering off the system seemed to be the only solution left. Upon reboot, the system told me what I already knew: I killed "LSA Shell" :o:

My conclusion: security demos are fun, but don't prepare them on a production machine (your day-to-day laptop, for example) in order to reduce scary moments. And never ever trust 3rd party (only 3rd party?) tools to work smoothly. Oh, did I tell you that I was running all this stuff in a VPC? :-)


In this post, I'll cover the Windows Vista Secure Startup feature to present one of the big security enhancements made in the Windows code-named "Longhorn" wave. In short, Secure Startup is a new security feature in Vista that addresses the concern of better data protection, based on hardware support.



So, what is all this Secure Startup stuff all about? Take a look back at my short definition of Secure Startup above. It's a security feature, it addresses data protection and it's based on hardware support.

Let's start with the latter, the hardware support. Currently, almost all security-related functionality is handled by the operating system and by applications running on top of the OS in order to protect data stored on a machine. In the end, security is all about enforcing security rules and policies to make sure data and services are protected against malicious usage in whatever form it may take (e.g. data disclosure). However, this approach of having the software deal with all security-related matters also makes the system vulnerable by its very nature. As the system itself has to store and retrieve keys for data protection (e.g. the global system key, abbreviated as syskey, in Windows), the information used to secure things is right there on the hard disk. Imagine the very real risk of stolen laptops: it's a simple task to retrieve the data stored on the machine using tools that can be found on the internet, no tricks involved. Check out Steve Riley's (blog) column on The Case of the Stolen Laptop as well.

Okay, there are solutions that address this issue partially. For example, you could store encryption keys on a separate medium, e.g. some removable storage device. However, it's key to realize that you only improve the security of the data secured by that very key. For example, it's wrong to think that storing the syskey on a floppy disk will enhance the security of all data stored on disk to a high extent: data which was unencrypted before will still be unencrypted. You're only securing the data which is secured by the syskey. And of course, physical security still plays a very important role.

<Intermezzo subject="syskey">

A computer contains a bunch of critical security-related information nowadays. It's clear that this information needs to be protected against a broad variety of attacks, including information disclosure. This information includes the local Security Account Manager (SAM) database, Active Directory account information, IPsec keys, wireless network keys (WEP, WPA), computer keys (e.g. used for Kerberos), system recovery passwords (e.g. Active Directory restore mode), secrets managed by the LSA (local security authority), SSL keys, EFS keys, etc. All this stuff protects something, but how is this stuff protected itself? The answer is that there are keys to encrypt the (private) keys stored on the system. These are called master keys, an example being the master key associated with a user in order to protect his/her EFS, SSL, etc. keys. In turn, these master keys are encrypted by a kind of "root key" called the syskey. I've been blogging about this thing earlier on, as I'm a syskey addict :-).

By default, the syskey is stored on the computer itself and is randomly generated during Windows Setup. This information is then spread across the registry (called "scattering") in a pattern which is unique for your Windows installation ("obfuscation"). Using the syskey.exe tool included with Windows in the system32 folder, you can change the way the syskey is stored or derived. A first option is to store the key on a floppy disk. You simply can't boot the pc without the floppy, and you can't retrieve secured information without having the key (but remember that unprotected data physically stored on the hard disk remains unprotected). Another mode enables you to enter a system boot password from which the system key is derived. Check out the syskey.exe tool, but do it with care.
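Windows' actual password-to-syskey derivation is internal and undocumented, but the general idea of deriving a key from a boot password can be sketched with any password-based key derivation function. Purely as an illustration (PBKDF2 here, which Windows does not use for this purpose):

```python
import hashlib

def derive_boot_key(password: str, salt: bytes) -> bytes:
    """Illustrative only: derive a 128-bit key from a boot password.
    The real syskey derivation inside Windows is different and undocumented."""
    return hashlib.pbkdf2_hmac("sha1", password.encode(), salt, 100_000, dklen=16)

salt = b"\x00" * 16          # fixed salt so the example is reproducible
key = derive_boot_key("my boot password", salt)
print(len(key))              # 16 bytes = 128 bits, the size of the syskey
```

The essential property is that the key never needs to be stored anywhere: it's recomputed from the password at every boot.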

Syskey can help against information theft when combined with EFS. EFS encrypts the data on disk, whereas syskey encrypts the keys used to encrypt the data. By moving the system key off the hard disk using syskey.exe, e.g. to a floppy or by using a password, security is enhanced in the "case of the stolen laptop (or whatever machine)". The syskey password mode is the safest of the three available modes, as there exist cracking tools to extract the syskey from the registry (no, I won't be posting links to these tools :p).

</Intermezzo>


Another example is the use of EFS, the Encrypting File System, in Windows 2000, XP and 2003. With EFS (an NTFS-related feature), a user can encrypt his/her data stored on disk, completely transparently for further usage. However, this only secures users against each other on a multi-user system. It does not secure you against a stolen computer from which the EFS keys are recovered in order to decrypt a user's data. And there's also a threat that comes from recovery agents, who therefore should be trustworthy people.

<Intermezzo subject="EFS">

Let's talk about EFS a little further. EFS stands for Encrypting File System and was introduced in the Windows 2000 operating system family as an add-on device driver for the NTFS driver (from XP on, it's merged into the NTFS driver itself). What it does is provide a transparent way to encrypt/decrypt files and folders at the level of the file system. (Read: EFS is all about data confidentiality through encryption, not about integrity and protection against tampering.) By transparent I mean that you don't need a separate password in order to access a file. EFS is available on Windows 2000, Windows XP Professional and Windows Server 2003. Now the technical stuff.

As a side note, I want to mention that you should never encrypt single files; always encrypt entire folders. This is because EFS creates a temporary plaintext "shred" of the file (called Efs0.tmp) in order to encrypt it and then copies the result back to the folder where it belongs. This shred can remain on the disk, leaving a potential attack open. The only effective way to solve this is to encrypt entire folders or to wipe the disk (but prefer the former if you can).

First, a little word on how to encrypt a file or folder. In Windows Explorer, right-click a file or folder and go to Properties. On the General tab, click Advanced and mark the "Encrypt contents to secure data" checkbox. That's it. Now, when another user tries to open the file (assuming he/she has read permission on the file), he/she will get an "access denied" error. Another way to encrypt/decrypt a file/folder is by using the cipher.exe tool with the /E and /D flags. More information can be found in the Windows Help and Support documentation. Geeks can also use advapi32.dll's EncryptFile function, which calls into the feclient.dll file that handles file encryption. The actual encryption/decryption is done by part of the LSA system, which runs as SYSTEM, so impersonation and user profile loading are involved.

Now, how is this encryption done behind the scenes? It should be clear we need a key in order to encrypt the file. Next, we need an algorithm to encrypt and decrypt the data efficiently, which means we should use symmetric encryption. The former, the key, is the so-called File Encryption Key (FEK), a key that's randomly generated on a file-by-file basis (when you encrypt a folder, you're in fact flagging the folder so that every single file inside it gets encrypted). The latter, the encryption algorithm, is DESX on Windows 2000. On Windows XP SP1 and later and on Windows Server 2003 there's also support for 3DES and AES. By default, Windows 2000 and Windows XP (pre-SP1) use DESX, whereas Windows XP SP1 and later as well as Windows Server 2003 use AES.

Note: You can force 3DES to be used as the encryption standard for all cryptographic services on the system by altering the local system policy (Security, System Cryptography, "Use FIPS Compliant Algorithms For Encryption, Hashing and Signing"). If you prefer to use the registry, go to HKLM\SYSTEM\CurrentControlSet\Control\LSA and set FipsAlgorithmPolicy to 1. To control the encryption algorithm for EFS only, go to HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\EFS and change AlgorithmID to 0x6603, which enables 3DES. Note that on Windows 2000, this won't work if the High Encryption Pack (separate floppy disk) isn't installed. Changing the encryption mechanism this way doesn't change the encryption for existing files, it only affects newly encrypted files.

To continue, what happens with the FEK? Well, it's stored together with the file. In order to secure the encryption key so that only the owning user can decrypt the file, the FEK itself is encrypted using public/private key encryption (RSA) with the user's public EFS key. If multiple users need to be able to access the file (which is the case when using recovery agents - by default the administrator is a recovery agent - for instance), there is one encrypted FEK for each user (encrypted with that user's public key). The FEK instances encrypted for regular users are stored in the file's so-called Data Decryption Field (DDF); those for recovery agents go into the Data Recovery Field (DRF).
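The FEK construction is classic "envelope encryption": a fresh symmetric key per file, itself wrapped once per principal. The toy Python sketch below mimics only the shape of the scheme; it uses a keyed XOR stream instead of DESX/AES and symmetric key-wrapping instead of RSA, so it illustrates the data flow, not the real cryptography:

```python
import hashlib
import os

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher (NOT DESX/AES): XOR with a SHA-256 based keystream.
    Applying it twice with the same key recovers the original data."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ s for b, s in zip(data, stream))

def encrypt_file(plaintext: bytes, wrappers: dict) -> dict:
    """Generate a random per-file FEK, encrypt the data with it, and store one
    wrapped copy of the FEK per principal (mimicking the DDF/DRF layout)."""
    fek = os.urandom(32)
    return {
        "ciphertext": xor_stream(fek, plaintext),
        "ddf": {user: wrap(fek) for user, wrap in wrappers.items()},
    }

# Stand-in for RSA wrapping with each principal's public EFS key
# (toy: wrap the FEK with a per-user secret instead).
users = {"alice": b"alice-master-key", "admin": b"admin-recovery-key"}
wrappers = {u: (lambda fek, k=k: xor_stream(k, fek)) for u, k in users.items()}

blob = encrypt_file(b"top secret budget.xls contents", wrappers)
# Alice unwraps her copy of the FEK, then decrypts the file data with it.
fek = xor_stream(users["alice"], blob["ddf"]["alice"])
print(xor_stream(fek, blob["ciphertext"]))
```

Note how the file data is encrypted exactly once, however many principals can open it; only the small FEK is wrapped repeatedly, which is what makes adding a recovery agent cheap.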

This mechanism provides a high stack of encryptions. At the very bottom we have the system startup key (syskey, see above). On top of that, there's a master key for the user. Next, we have the EFS key for the user. And last but not least, there is a separate key for each encrypted file. You might wonder where the private keys for the user live and how these are protected. The private keys are stored in %appdata%\Microsoft\Crypto\RSA\<usersid> and are encrypted by the user's master key, which lives in %appdata%\Microsoft\Protect\<usersid> and is itself encrypted based on the password of the user. If you change your password, the master keys are re-encrypted automatically. However, if your password is reset, this does not happen (it simply can't, because there is no way to use the old password for decryption first) and you lose your information if you don't have a recovery agent in place.

Other sources where you can find useful EFS-related information are:

Some general EFS-related guidelines include:

  • Use EFS in a domain configuration. On a standalone pc, you can break EFS using various tools, because all the secrets that are part of the encryption/decryption are kept locally (unless you're using syskey in mode 2 or 3, i.e. floppy or password mode). Another possible attack is to clear the local Administrator password by removing the local SAM database in %windir%\system32\config\sam. As the local Administrator is a recovery agent by default, logging in as local Administrator then makes it possible to recover the encrypted file. Syskey modes 2 and 3 solve this problem because you need the EFS keys, which are stored by the LSA in a so-called "secrets cache" that is physically protected by the syskey. In a domain, the domain admin is a recovery agent by default, which establishes a "physical gap" between the EFS-encrypted files on one system and the recovery agent keys on another one.
  • Be aware of the "interactive logon cache", a Local Security setting that caches the last n logons on the machine, which is particularly interesting for mobile users that need to log in to their computers while on the road, without access to a domain controller. Using this cache, an attacker might be able to authenticate and decrypt a user's files.
  • Avoid encrypting single files, because of the plaintext backup file that is created as efs0.tmp. Although this backup file is deleted after the EFS encryption has taken place, the data will still live on the harddisk and can be recovered by an attacker (e.g. by using the Support Tools' dskprobe.exe). As I mentioned above, cipher.exe /w can be used to wipe this free space (it was actually implemented by Microsoft in reaction to a Bugtraq post about this issue).

Finally, my personal advice is to use EFS in a domain environment and to consider using syskey mode 2 or 3 to move the system key away from the harddisk.


Also note the existence of so-called ATA passwords, which have nothing to do with the OS but can be used to secure all data stored on a harddisk. I won't cover these in further detail over here.

So, Secure Startup aims to offload certain aspects of data protection to the hardware, but in an OS-controlled fashion. Concretely, the Secure Startup feature uses a Trusted Platform Module (TPM 1.2) to protect user data and to protect against tampering while the system is offline. Just like EFS, Secure Startup is transparent to users. Basically, what it does is encrypt the entire Windows volume.


Trusted Platform Module

A Trusted Platform Module is a microcontroller that stores keys, passwords and digital certificates, and typically sits on the motherboard (of the upcoming generation of computers). TPM was "invented by" the Trusted Computing Group, which is backed by several computer companies, including Microsoft (see member list). TCG is a non-profit organization that aims to enhance the security of computing in general and was formed in early 2003, based on the work done by the Trusted Computing Platform Alliance (TCPA). TCG has impact on both hardware and software by delivering standards proposals such as TPM. Detailed specification documents can be found on the TCG website.

Back to TPM in particular now. In order to understand the role of TPM, you'll need to have a basic understanding of what TCG calls the "Trusted Platform". What follows is based on the "TCG Specification Architecture Overview" specification. In order for a platform to be called trusted, it should provide three basic features:

  • Protected capabilities are a collection of commands that interact with sensitive data that is "shielded" against malicious access. Such places are called "shielded locations" and include registers and places in memory where manipulations of sensitive data are guaranteed to be protected. The protected capabilities have exclusive access to these shielded locations. Examples include management of cryptographic keys and random number generation.
  • Attestation is all about the accuracy of information, which is an important factor in the trustworthiness of a platform. First, there is attestation by the TPM, which can be used to provide proof of data known to the TPM itself; using a so-called Attestation Identity Key (AIK), internal TPM data is digitally signed. Secondly, there is attestation to the platform, to provide proof of a platform's trustworthiness to report integrity measurements. Hand-in-hand with attestation to the platform we have attestation of the platform, used to provide proof of a platform's integrity measurements, also using an AIK, which signs PCRs (Platform Configuration Registers, part of the protected capabilities). Last but not least, there is the need for authentication of the platform's identity.
  • Integrity measurement, storage and reporting is the collection of integrity-related services to enhance and measure the trustworthiness of a platform. The integrity measurement part helps to obtain metrics which have impact on the overall trustworthiness of the platform, based on trusted states. Integrity storage is used to log integrity metrics in a safe manner, by digesting the contents and storing it in PCRs. Finally, integrity reporting is a form of attestation for the data stored in the integrity storage. The overall idea of this set of "services" is to have trustworthy evidence of the state a platform was/is in, to assist processes in evaluating these integrity states and taking appropriate actions upon that.

In order to provide this functionality, the TCG architecture requires components that must be fully trusted, since otherwise misbehavior can't be detected. These components are critical to the trustworthiness of the system and are called roots of trust. The collection of roots of trust in a system has to provide the functionality needed to describe the platform's characteristics that affect its trustworthiness. This includes:

  • Root of Trust for Measurement (RTM) to make reliable measurements of system integrity.
  • Root of Trust for Storage (RTS) for the secure maintenance of integrity digests.
  • Root of Trust for Reporting (RTR) for reporting of the RTS' contents.

Now, how is integrity measured in order to be stored and reported? To support integrity measurement, there is a so-called measurement kernel component that generates measurement events, consisting of two parts: a measured value and a hash digest of that value (e.g. SHA-1). Basically, these measurements and the corresponding digests are snapshots of the operational state of the machine. The digest is stored by the RTS, whereas the measured value can be consumed anywhere. To verify measured data, the digest values are recreated and compared to the stored ones. Sequences of related measurement event data are kept in a Stored Measurement Log, in which a common measurement digest is used as the starting point. Newly acquired measurement values are appended to the common measurement digest and rehashed in order to preserve ordering. This process is called extending the digest.
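The extend operation is simple enough to sketch. The following Python snippet follows the shape of the TPM 1.2 extend (new PCR value = SHA-1 of the old value concatenated with the event digest); the event names are of course made up, and a real verifier would replay the Stored Measurement Log against a quoted PCR value rather than a local one.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style extend: new PCR = SHA-1(old PCR || SHA-1(measurement)).
    digest = hashlib.sha1(measurement).digest()
    return hashlib.sha1(pcr + digest).digest()

pcr = bytes(20)  # PCRs start out zeroed (SHA-1 digests are 20 bytes)
log = []         # the Stored Measurement Log keeps the raw events
for event in [b"firmware", b"option ROM", b"boot loader"]:
    log.append(event)
    pcr = extend(pcr, event)

# A verifier replays the log: the same events in the same order reproduce
# the same PCR value; any change, omission or reordering breaks the match.
replayed = bytes(20)
for event in log:
    replayed = extend(replayed, event)
assert replayed == pcr
```

Because each extend hashes the previous value in, the final digest commits to the whole ordered sequence of measurements, which is exactly why ordering is preserved.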

Furthermore, the TPM can act as an endpoint of communication which means that it can be used to provide several security related services for secure exchange of data between systems, relying on a trustworthy identification of the systems involved in the communication. By providing key management support and configuration management, the TPM can help to improve security for communication between systems. In order to support this scenario, TCG defines four classes of protected message exchange:

  • Binding is the process of encrypting the to-be-transferred data with a public key of the intended recipient, who can only recover the message using the corresponding private key.
  • Signing generates a signature for the message that can be used to validate the integrity of the message and is typically done by hashing the message and encrypting it using a (signing only) key.
  • Sealing is an extension of binding and also encrypts messages. Furthermore, it binds the message to a set of platform metrics that must be fulfilled in order for the message to be decrypted. This binding ties the symmetric key used to encrypt the message (for speed) to a set of PCR values and the asymmetric key. Sealing clearly improves the trustworthiness of a platform by requiring the platform to be in a certain state before decryption can be done.
  • Sealed-signing links the signing operation with the PCR registers of the machine creating the signature. This allows a verifier to check the PCR contents included with the signed message, in order to have a clear picture of the signing platform's configuration at the time of signing.
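Sealing is the message-exchange class that matters most for Secure Startup, so here's a toy Python sketch of the idea: a secret is released only when the current platform state matches the state it was sealed against. Everything here is a stand-in (a real TPM keeps the secret and the comparison inside the chip, and an attacker could trivially break this software-only version); only the "no matching state, no key" behavior is the point.

```python
import hashlib
import os

def _policy(pcrs: dict) -> bytes:
    # Digest of the PCR values the secret is bound to (illustrative encoding).
    return hashlib.sha1(repr(sorted(pcrs.items())).encode()).digest()

def seal(secret: bytes, expected_pcrs: dict) -> dict:
    policy = _policy(expected_pcrs)
    key = hashlib.sha256(policy).digest()       # toy key derived from the policy
    blob = bytes(a ^ b for a, b in zip(secret, key))
    return {"blob": blob, "policy": policy}

def unseal(sealed: dict, current_pcrs: dict) -> bytes:
    # A real TPM enforces this check in hardware; here it's just an if-test.
    if _policy(current_pcrs) != sealed["policy"]:
        raise ValueError("platform state does not match; unseal refused")
    key = hashlib.sha256(sealed["policy"]).digest()
    return bytes(a ^ b for a, b in zip(sealed["blob"], key))

good_state = {0: "firmware-ok", 4: "bootmgr-ok"}   # hypothetical PCR contents
volume_key = os.urandom(32)
sealed = seal(volume_key, good_state)

assert unseal(sealed, good_state) == volume_key    # same state: key released
try:
    unseal(sealed, {0: "firmware-ok", 4: "tampered"})
except ValueError:
    pass                                           # changed state: refused
```

In the real thing, the refusal path is what blocks the boot process until a recovery key or password is supplied.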

The protected storage component (RTS) holds keys and data entrusted to the TPM. Embedded in the TPM there are two (semi-)fixed keys: the Endorsement Key (EK, used to establish a platform owner) and the Storage Root Key (SRK, associated with the platform owner, can be replaced). Keys stored in the TPM fall into two categories: migratable keys (which can be transferred to another TPM) and non-migratable keys (which cannot leave the system). An AIK (see the attestation discussion above) is a prime example of a non-migratable key. TCG defines 7 key types, each dedicated to certain functionality:

  • Signing keys - asymmetric, migratable or non-migratable, sign application data and/or messages
  • Storage keys - asymmetric, encrypt data or other keys
  • Identity keys (AIK) - asymmetric, non-migratable, used to sign data originating from the TPM (e.g. PCR register values)
  • Endorsement keys (EK) - decryption key, non-migratable, used to decrypt owner authorization data when the owner of a platform is established, also used to decrypt AIK-creation related messages
  • Bind keys - encrypt/decrypt small amounts of information to be transferred across platforms, e.g. symmetric keys
  • Legacy keys - keys created outside the TPM (e.g. by the OS or an application), imported into the TPM and also migratable
  • Authentication keys - symmetric keys to protect transport sessions in which the TPM is involved

The RTS interfaces with storage devices through a so-called Key Cache Manager which I won't cover over here. You can find more information in the "TCG Specification Architecture Overview" document mentioned above.

This brings us to the TPM components, which I'll cover a little further. Keep in mind we're talking about a hardware component over here; the software aspect will be covered further on. A first important component is the I/O component, which acts as the interface of the TPM to the outside world. It's connected to the TPM's internal communication bus and routes messages to the right destination. Next, there are a couple of engines for cryptographic functionality, such as the random number generator, an engine for SHA-1 hashing, a key generation engine and an RSA engine. Further on, there is a piece of non-volatile storage that holds the EK, SRK and other owner-related data. In terms of memory, there are also the PCRs, which can be either volatile or non-volatile. Another piece of storage contains the AIKs, although TCG recommends storing these keys outside the TPM (nevertheless, there is reserved room inside the TPM component itself). Further on, there is program code living inside the TPM (which acts as the Core Root of Trust for Measurement or CRTM). The execution engine runs the program code. Finally, a so-called Opt-In component is used to enable/disable/deactivate various operations that the TPM is capable of providing.

For the operational states of the TPM component, I refer to the "TCG Specification Architecture Overview" document again.

On to the software aspect of TPM. At the very bottom we have of course the TPM component itself, which is managed by the TPM Device Driver in the OS' kernel mode. On top of that we have user mode, consisting of system processes and user processes. In the system processes space, you'll find the TCG Device Driver Library (TDDL), which provides an interface (TDDLI) to players higher in the stack. The TSS Core Services (abbreviated as TCS; TSS stands for TCG Software Stack) talk to the TDDLI interface and provide a TCSI interface to the user processes. In the user processes space, there's a service model called the TCG Service Provider or TSP, which has an associated interface called TSPI. Finally, we end up with the application interacting with this TSPI interface. Detailed information about these interfaces can be found in the "TCG Specification Architecture Overview". The TCG Software Stack Specification (TSS) is available over here as well as the header file (C).


Secure Startup

The Windows Longhorn Secure Startup feature acts on three different fields:

  • Data protection of offline systems - a non-connected offline system is safe against data theft because of encryption of user and system data (read: the entire Windows volume) including the hibernation file, the page file, system files, temporary files, user data, etc. All applications on the machine also benefit from additional (implicitly added) security for offline scenarios.
  • Ensuring boot integrity includes tamper detection for monitored files. The system won't boot if tampering (while the system was offline) is detected.
  • Hardware recycling - as the TPM contains encryption keys and other encryption material, simply erasing its contents makes the volume useless and non-recoverable. This eliminates the need for disk sweeping to physically delete critical content.

The overall idea is to encrypt the entire Windows volume, therefore including the SYSKEY, without having the top-most encryption key on the harddisk but moving it to a hardware component (the TPM). When the system is booting, the keys needed to read the data from the harddisk to boot Windows are only released when no tampering is detected. This is done by unique measurements from multiple factors during system boot, which results in a digest. If someone has tampered with the boot system, the digest won't be the same and the tampering is detected. The use of TPM is completely transparent for the operating system during boot and further operation. Notice that the encrypted data on a protected partition is bound to a particular installation of Windows.

Let's go into somewhat more detail and look at the boot process with TPM enabled. First of all, there's a firmware security bootstrap mechanism that kicks in. This procedure is encoded in the TPM hardware and is ultimately responsible for ensuring system integrity, through the Core Root of Trust for Measurement (CRTM). When the system is considered to be secure and trustworthy, a series of measurements is done to create a fingerprint of the system. At boot time, the same measurements are made in order to check the system's integrity. This mechanism is called Static Root of Trust Measurement (SRTM) and makes sure the encryption key for the system volume is only unsealed when the integrity verification process succeeds. The measurements I've referred to are stored in the PCRs of the TPM component and include the following: firmware, option ROMs, a step-by-step measurement of the boot process up to the BOOTMGR level, and a measurement of the boot-time events that occur (OS boot time). The volume encryption itself is based on block encryption.
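Putting the pieces together, the SRTM check boils down to: accumulate a digest over the boot chain, and compare it to the digest recorded when the feature was enabled. Here's a minimal Python sketch of that comparison; the stage images are hypothetical stand-ins for the real firmware, option ROM, MBR and BOOTMGR measurements, and the real chain works stage by stage (each stage measures the next before handing control to it) with the TPM doing the sealing.

```python
import hashlib

def extend(pcr: bytes, data: bytes) -> bytes:
    # Same extend shape as the TPM: new PCR = SHA-1(old PCR || SHA-1(data)).
    return hashlib.sha1(pcr + hashlib.sha1(data).digest()).digest()

def measure_boot(images) -> bytes:
    # Each stage measures (extends) the next one before running it.
    pcr = bytes(20)
    for image in images:
        pcr = extend(pcr, image)
    return pcr

# Hypothetical boot-stage images, in boot order.
stages = [b"firmware image", b"MBR code", b"BOOTMGR image"]

# Recorded once, when Secure Startup is enabled on a known-good system.
expected = measure_boot(stages)

# On later boots, the volume key is only unsealed if the fresh measurement
# matches; any offline tampering with a stage changes the digest.
assert measure_boot(stages) == expected
tampered = [b"firmware image", b"rootkit MBR", b"BOOTMGR image"]
assert measure_boot(tampered) != expected
```

The mismatch case is where the boot process stops and asks for the recovery password or recovery media described below.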

In order to run Secure Startup in Vista, the system needs to match the following requirements:

  • TPM 1.2 should be available on the hardware and should be enabled (see operational modes in "TCG Specification Architecture Overview"). Ownership should be taken as well (see further).
  • The Master Boot Record (MBR) of the harddisk should be modified by a version shipping with Vista that can talk to the TPM hardware (to collect measurement data etc).
  • Vista must be installed on an NTFS volume.

Notice that Vista also supports the use of EFI (extra link) with TPM. The classic non-EFI scenario is to have an MBR-based active NTFS partition that contains Vista. EFI is a more recent firmware specification that replaces classic boot loaders and is especially suited for trusted computing scenarios. The idea is to offload the system hardware preparation to the firmware at boot time, freeing the OS from these tasks. TPM boot procedures can be supported by such an EFI BIOS.

Vista supports TPM through the Windows Security Center, where various management tasks are available. First of all, there's of course support to enable Secure Startup on the machine; this is where the encryption is started after a reboot. Next, there's support to disable Secure Startup, with the option to retain encryption (which moves the key to a floppy, or to the harddisk with support for a password - compare to the syskey modes) or to remove encryption (decryption kicks in). To support recovery, an administrator can choose to store a recovery key on removable media or to supply a password for recovery. Recovery is needed when the hardware changes (old harddisk in a new computer with a different TPM), when the system is tampered with, when the MBR or system data is corrupted, for debugging, or for offline system updates. In all of these cases, the measured data does not match the originally gathered data, and the boot process is blocked until a recovery password or medium is supplied by the end user. These passwords and keys for recovery can be stored in Active Directory, enabling enterprise-level scenarios.

With this, I want to conclude a brief overview of the Secure Startup technology based on TPM in Windows Vista. I want to stress that Secure Startup is not the only feature of Vista that leverages the power of TPM. Other places include WMI support for TPM administration tasks, Group Policy support for TPM, a Key Storage Provider (KSP) that works with TPM and the exposure of the TSS interfaces to 3rd party applications. More information can be found via the links mentioned below.


Links
