Managed meets functional

Blog about programming and having fun with .Net

About me




Welcome to my blog!
My name is Alexander Galkin. I was born in 1979 in Kazan, Russia, where I graduated in pediatric medicine.
Since 2001 I have lived in Hamburg, Germany, where I work as a freelance software and database architect and as a trainer for Microsoft technologies.

Microsoft Certified Trainer
Microsoft Certified Professional Developer
MCTS (Microsoft Certified Technology Specialist)
MCITP (Microsoft Certified IT Professional)


New Azure features for MSDN subscribers, and the confusion around them

In his last two posts (one, two), Scott Guthrie announced a number of changes to the Azure MSDN benefits for developers and testers. The announcement also caused confusion among developers, because the changes bring not only new features but also certain restrictions for MSDN subscribers.

If you convert the services you previously received through MSDN into money, the Premium subscription comes to about €90, whereas after the changeover you will only have $100 at your disposal, which, converted, should be somewhat less. Right at the outset, however, it must be said that the prices for development purposes have been discounted considerably. An image from Scott Guthrie's blog is very informative here (see left).

Moreover, you no longer have to build your own test system in the company to test your software: you can simply create a virtual network in the cloud and test everything there (above all integration tests and anything concerning EDI and EAI!). It is a well-known problem in many companies that there is often not enough budget for a proper test bench, and test systems are rarely used at more than 10% of their capacity. Together with the MSDN discount, the MSDN rights for cloud deployment (MSDN licenses may also be used in IaaS scenarios) and the new per-minute billing, Azure, and above all Microsoft's IaaS offering, where Microsoft has recently been scoring points and taking the lead, is supposed to become much more attractive for developers and testers.

As for the restrictions: an MSDN subscription is meant to let you develop software for Microsoft systems flexibly, without having to account for the cost of individual development and test systems. With your subscription you can have 10 to 15 licenses per operating system and product; many developer products (such as Visual Studio, MS SQL Server, etc.) are even "pre-paid", so you can theoretically install and use an unlimited number of them. BUT you may under no circumstances use the MSDN subscription for production. That is, as soon as the software goes into production and is operated for its real purpose, you may no longer draw from the eternal fountain of life, but only from your own pocket, to buy the production licenses.

For a long time, the MSDN Azure benefits were a way to get some Azure services for free, effectively bundled with your MSDN subscription. They were primarily intended to support software developers with their cloud solutions, so that you bear as little cost as possible when developing for the Microsoft cloud. The offer was kept so broad, however, that these benefits could theoretically (and probably not seldom in practice) also be used for small production environments.

That is over now. The MSDN subscription is still meant ONLY for development purposes, and after the changeover these become considerably easier in the cloud. You are now encouraged to move your test system into the cloud first, if you do not yet dare to move production there as well. In my eyes as a developer this is a very good and very positive change, which lets me test my software without having to buy hardware for it. The restriction to 120 hours is not very relevant for test and development purposes, since testing never takes that long, and if you really need a lot of time you can surely talk to Microsoft beforehand and explain everything. All Microsoft cares about is that the MSDN benefits are used as intended.


Categories: Azure | german | SQL Azure
Permalink | Comments (0) | Post RSS | RSS comment feed

Cursory look at Google Cloud through the eyes of Azure Insider

 

I must confess that I usually stick to Microsoft technologies, be it development or the cloud. This is not ideal, but I rarely have enough time to look around and compare competitors. After all, my role is seldom to argue for or against a certain technology; more often it is to implement something concrete, and deep knowledge of one technology helps me more than a superficial understanding of four rival ones.

But last week I was asked to give a short overview of Google's cloud offering, and it was genuinely entertaining to make the comparison and boil it down to almost an elevator pitch. I ended up with 8 slides, which I shared via SlideShare.

The key slide is the following juxtaposition of Google and Azure services. I tried to reuse the iconography of both Google and Azure to illustrate the services, so a cursory glance should be enough to get a first impression. Those interested in the details are welcome to view all 8 slides of the presentation by following the link above.

 


Categories: Azure

Sample scenarios for Azure

Some potential scenarios for the different Windows Azure services, from my answer on StackOverflow:

Azure Mobile Services covers the scenarios where you have multiple (mobile) devices running occasionally connected applications that need to synchronize their content through the cloud. AMS gives you the possibility to implement custom processing logic for data requests and updates; it hides the burden of implementing and hosting a web service. About 90% of the logic is configured or written directly in the management portal; the rest is just client logic. The main purpose of this service is data sync (this is the core functionality); all other services (authentication, logging, scheduler) are auxiliary. The development language is JavaScript, and the whole experience is similar to server-side development with frameworks like Node.js.

Azure Web Sites is the way to host your code within IIS; usually that would be a web page, but nothing prevents you from hosting your web services (Web API based or even full-fledged WCF) here as well. Azure Web Sites are easy to deploy and are a rather cheap way of hosting web services, provided you allow IIS applications from other users to run alongside yours (a shared instance); you can also prioritize your application by going for a reserved IIS instance (and paying more). You can reuse most (virtually all) of your existing business logic here, unless you need something exotic like interop or shell access that cannot be hosted in IIS natively. The disadvantage of this solution is that your logic runs within the context of your web service, which can be suboptimal for long-running processing.

Azure Cloud Services allows you to defer the processing of logic rules and to decouple the logic from the service input. In this scenario you have two kinds of roles, typically called web roles and worker roles. A web role provides the endpoints for your services and queues the requests; a worker role reads the queue and does the processing. This lets you fine-tune your load balancing and capacity planning by scaling the number of parallel web role and worker role instances independently.
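The web role / worker role hand-off can be illustrated with a plain in-memory queue standing in for the Azure storage queue. All names here (WebRole, WorkerRole, AcceptRequest) are invented for illustration; a real deployment would use the Azure SDK and separate role instances.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Illustrative stand-in: the endpoint does no processing, it only enqueues.
class WebRole
{
    private readonly ConcurrentQueue<string> _queue;
    public WebRole(ConcurrentQueue<string> queue) { _queue = queue; }

    public void AcceptRequest(string payload) => _queue.Enqueue(payload);
}

// Illustrative stand-in: polls the queue and processes whatever it finds.
class WorkerRole
{
    private readonly ConcurrentQueue<string> _queue;
    public WorkerRole(ConcurrentQueue<string> queue) { _queue = queue; }

    public List<string> DrainAndProcess()
    {
        var results = new List<string>();
        while (_queue.TryDequeue(out var payload))
            results.Add(payload.ToUpperInvariant()); // stand-in for real work
        return results;
    }
}

class Program
{
    static void Main()
    {
        var queue = new ConcurrentQueue<string>();
        var web = new WebRole(queue);
        var worker = new WorkerRole(queue);

        web.AcceptRequest("order-1");
        web.AcceptRequest("order-2");

        foreach (var r in worker.DrainAndProcess())
            Console.WriteLine(r);
    }
}
```

The point of the pattern is exactly this separation: the front end stays responsive because accepting a request is just an enqueue, while the back end can be scaled out by adding more consumers of the same queue.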


Categories: Azure

Sample databases for SQL Azure that support Entity Framework

While working on my tutorials on SQL Azure as a backend for LightSwitch, I bumped into the problem that the SQL Azure sample database AdventureWorks is not (fully) supported by the latest version of Entity Framework (EF5). As a consequence, one cannot create a Data Context for the whole database, which is a prerequisite for all frameworks using EF for data access (besides LightSwitch, ASP.NET Dynamic Data applications would also be affected).

In my case I only needed a demo database, so I took the good old Northwind database for SQL Server 2000, installed it locally and migrated it to a SQL Azure instance. Then I could easily create the Data Context in EF5 and implement sample applications in LightSwitch.

So, summing up my experience: if you need a sample database running on Azure, you might

  1. Take the SQL Azure version of AdventureWorks here; you might need to exclude some tables from the database model, however, to get this database working under Entity Framework 5.
  2. Use the good old Northwind: download the legacy version for SQL Server 2000 from the Microsoft page and migrate it to SQL Azure.
  3. Use some other sample database, such as Chinook. Chinook provides full support for EF5; in other words, EF5 supports everything Chinook implements.


iTunes suddenly fails to start on Windows 8

Yesterday I faced the unexpected problem that my iTunes stopped working. Just three weeks ago everything was fine and it started without any problem; now it does not start at all. Whenever I launch it from the start screen, it pops up in the list of processes for about 10-15 seconds and then exits, starting no GUI thread whatsoever.

The solution suggested in numerous support forums is usually to start it in compatibility mode. This is wrong; please don't do it, because iTunes detects that it has been started in compatibility mode and may refuse to work because of it!

The correct solution is to uninstall QuickTime. It works like a charm, and you don't even have to restart your computer afterwards. Just make sure you choose "Uninstall completely" (the leftmost option in the uninstaller).

Interestingly enough, the Solution Finder tool in Windows 8 also suggests compatibility mode, even that of Windows XP. Please don't follow this recommendation.



Approaching Fletchers as if they were Software Developers

I had to write a sample job offer for my English class and decided to approach the task with creativity, so I came up with a job offer for a fletcher instead. Enjoy!

Senior Fletcher

BowMasters (www.BowMasters.com) is a leading bow manufacturing company. BowMasters offers consulting, bow development and maintenance, BRP and BPO services to clients across the globe. With its relentless focus on achieving the highest quality in its deliveries, BowMasters has become a trusted brand and achieved consistent growth in its repeat business with its clients.

BowMasters is renowned for its experience in working with the world's best technology companies and some of the world's leading enterprises in the vertical industries. Our commitment to quality is reflected in the fact that all of the world's top five technology companies are currently long-term clients of BowMasters.

BowMasters is looking for a world-class fletcher who will design, manufacture, test and maintain fletching for customers worldwide.

Our team consists of highly-skilled bowyers, fletchers, and arrow head makers, who are passionate about building the internationalization infrastructure that enables our products to be used by millions of users worldwide. 

Job qualifications:

• A Master of Archery (M.Arch.) degree

• At least four years of related manufacturing experience with an outstanding track record

• Expertise in fletching for longbows and flatbows, composite bows and compound bows

• Knowledge of recurve and laminated bows is a plus

• Strong skills in bodkin, judo and target points, as well as field points and blunts

• A good understanding of military, mounted and field archery

• Good knowledge of bow strings

 

Responsibilities:

• Manufacture fletching for new bow models

• Maintain existing bow models, manage social contacts with customers and control user experience

• Document activities from design to completion and participate in project testing

 

Location: San Francisco, CA

Compensation: DOE (up to 100k), competitive salary and equity, flexible work hours, stock options, matching 401(k) contributions, comprehensive health benefits (medical, dental, vision), free lunches and snacks, semi-annual bonuses based on company performance, and a monthly commute allowance

Principals only. No relocation provided.

If you are an extraordinary person who strives for excellence, please submit your resume via pigeon post, pigeon hole #114. BowMasters is proud to be an equal opportunity employer that is committed to a diverse workforce.

 

 


Categories: general | opinion

Expression Blend crashes on start-up with error in the .NET Runtime at IP 5F791825 (5F790000) with exit code 80131506

There is an incompatibility issue between Expression Blend 4 and .Net 4.5 (currently a developer preview, installed as part of the Visual Studio 11 Preview).

This issue leads to Expression Blend crash on start-up with the following error message: "The process was terminated due to an internal error in the .NET Runtime at IP 5F791825 (5F790000) with exit code 80131506."

The issue is reported on MS Connect and there is a workaround: download the attached bat file, store it in a temporary location, right-click the file and select "Run as Administrator" from the context menu. This solved the problem in my case.

blend.bat (576 bytes)



Updated version of GLSL parser by Laurent Le Brun to compile with FParsec 0.9.1

The GLSL parser by Laurent Le Brun is a gold nugget for anyone who would like to use FParsec to parse C-like languages. Since I plan to use this script as a starting point for my TouchDevelop parser, it was the first thing I looked at after fixing F# support in my Visual Studio 2010.

It turned out, however, that the script is somewhat outdated: written in 2010 (according to the date of the blog entry), it requires an earlier version of the FParsec library. Compiling it against the latest stable version of FParsec (0.9.1) fails due to some renamed identifiers (abbreviations that were previously used in FParsec, like "Assoc", are now written out in full). Besides, the signature of the generic SyntaxParser constructor has changed to include the UserState type.

I adapted the code by Laurent Le Brun; a working version of the script with minimal changes (see below) can be downloaded from this blog entry.

Changes compared to the original version of the script:

  1. Assoc -> Associativity
  2. µOp -> µOperator where µ in {Infix, Prefix, Postfix, Ternary}
  3. Parse -> GlslParser
  4. Ast -> GlslAst
I also put together a small VS2010 project containing the latest version of FParsec, along with a sample GLSL script that is parsed automatically when run in debug mode.
Files:

glsl_parser.fs (8 Kb) 

VisualStudio 2010 solution with FParsec 0.9.1 and sample script (1.36 Mb)


Categories: .net | F# | parser

Using LINQPad as scrapbook for FParsec

FParsec is a smart .NET implementation of the famous Parsec library from Haskell.

FParsec belongs to the class of parser combinators, meaning that you don't have an IDE or a formal definition of your grammar (as in the case of ANTLR/ANTLRWorks). Rather, you deal with primitive parsers that consume strings or digits, and you combine them in clever ways to implement your parser. The main advantage is that you have full control over what you are designing and can use the full strength of the underlying language (in this case F#).
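To illustrate the combinator principle (this is not FParsec's actual API, just a hand-rolled sketch in C#): a parser is a function from input and position to a result, primitives consume single characters, and combinators build bigger parsers from smaller ones.

```csharp
using System;

// A parser is a function: given the input and a position, it either
// succeeds (advancing the position and producing a value) or fails.
delegate (bool Ok, int Pos, string Value) Parser(string input, int pos);

static class Combinators
{
    // Primitive parser: consume one character satisfying a predicate.
    public static Parser Satisfy(Func<char, bool> pred) => (s, i) =>
        i < s.Length && pred(s[i]) ? (true, i + 1, s[i].ToString())
                                   : (false, i, "");

    // Combinator: apply p one or more times, concatenating the results.
    public static Parser Many1(Parser p) => (s, i) =>
    {
        var (ok, pos, value) = p(s, i);
        if (!ok) return (false, i, "");
        var acc = value;
        while (true)
        {
            var (ok2, pos2, v2) = p(s, pos);
            if (!ok2) return (true, pos, acc);
            acc += v2;
            pos = pos2;
        }
    };
}

class Program
{
    static void Main()
    {
        // "An integer" = one or more digits, built from the pieces above.
        var digits = Combinators.Many1(Combinators.Satisfy(char.IsDigit));
        var (ok, pos, value) = digits("123abc", 0);
        Console.WriteLine($"{ok} {value} (stopped at {pos})");
    }
}
```

FParsec provides the same idea as a rich library of primitives (pfloat, pstring, many, sepBy, ...) plus good error reporting; the sketch only shows why no external grammar file is needed.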

Since I have problems with F# in my main Visual Studio installation which I couldn't repair (here is the respective StackOverflow question), I decided to use my favorite tool, LINQPad, for learning and designing parsers, and it worked perfectly.

So, if you want to use LINQPad with FParsec, the only thing you have to do is add references to the FParsec DLLs. This is done by going to the application menu:

and then adding both FParsec.dll and FParsecCS.dll to the "additional references". You will find these DLLs after the first compilation of the FParsec source code.

For your convenience I am attaching an archive with pre-compiled DLLs to this post.

From now on you can use FParsec freely, just don't forget to open the FParsec namespace in your code.

I also adapted the examples from the FParsec tutorial to run in LINQPad. These samples expect the DLLs to be stored under the following path: D:\Dev\FParsec\DLL\

You can download the complete tutorial scripts and the samples provided with the FParsec using the link below.

FParsec.dll and FParsecCS.dll (.zip, 176 kb) 

LINQ queries for tutorial and samples (.zip, 13 kb)


Categories: F# | parser

About Refactoring

I just answered a question on Programmers@SO and would like to post my answer here as well:

Q:

  1. How does [refactoring] take place in the software development process, and how far does it affect the system?
  2. Does refactoring using these tools really speed up the process of development/maintenance?

A: First of all, depending on the site of refactoring, one can distinguish several types: code refactoring, database (schema) refactoring, refactoring of unit tests, refactoring of the GUI, etc.

There are several situations where you can meet refactoring during software development:

  1. Refactoring is known to be a mandatory step in certain agile development techniques like test-driven development, where one is supposed to perform a refactoring step after every implementation step. In this case the refactoring targets only the latest implementation, and its goal is to integrate the new code into the existing code base in the most optimal way.

  2. Refactoring can be done when some internal problems in working code are detected: so-called "code smells". This assessment is in many respects subjective, despite the fact that it can be based on concrete code metrics (such as the number of lines of code per method, the cyclomatic complexity of the code, etc.). Here the goal of refactoring is to improve code quality by changing the code so that the metrics used for quality estimation return to the expected range.

  3. You often need to refactor code to realize certain principles of programming in your code; look up Clean Code development to learn more about such principles.

  4. You may need to refactor your code and database schema to prepare them for coming changes, especially ones that were not considered during the design phase of the project. For example, data normalization and denormalization often take place during data-driven software development to prepare the database for possible extensions.

Refactoring tools available on the market basically support the developer in two ways:

  1. While writing your code, you get suggestions on how to improve it "on the fly". Whereas many fallacies can be detected directly by your IDE, such as Visual Studio or Eclipse (for example dead code, or variables declared but never used), refactoring tools like ReSharper can reveal problems that are far less evident, like loops that can be rewritten as LINQ queries.

  2. These tools also support you with custom refactoring steps, like globally renaming identifiers, splitting class declarations into separate, properly named files, extracting interfaces and base classes from a class implementation, etc. They save a lot of work here, especially in projects with a large code base, but you must first know what you actually want to refactor.

Actually, using tools like ReSharper in everyday development is so useful that it makes you almost dependent on them: they really accelerate the process of writing code, especially if you know how to use them appropriately!
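As a tiny illustration of the kind of mechanical step such tools automate, here is an "extract method" refactoring sketched in C# (the invoice example and all names are invented for illustration); the behavior is unchanged, only the structure improves:

```csharp
using System;
using System.Linq;

class InvoicePrinter
{
    // Before: one method mixes the computations and the presentation.
    public static string PrintBefore(decimal[] items)
    {
        decimal total = 0;
        foreach (var i in items) total += i;
        decimal tax = total * 0.19m;
        return $"net {total}, tax {tax}";
    }

    // After "extract method": each computation gets a name of its own,
    // which makes it readable and independently testable.
    static decimal Net(decimal[] items) => items.Sum();
    static decimal Tax(decimal net) => net * 0.19m;

    public static string PrintAfter(decimal[] items)
    {
        var net = Net(items);
        return $"net {net}, tax {Tax(net)}";
    }

    static void Main()
    {
        var items = new[] { 10m, 20m };
        Console.WriteLine(PrintBefore(items));
        Console.WriteLine(PrintAfter(items)); // identical output by design
    }
}
```

A tool performs this transformation safely because it analyzes which locals and parameters the extracted statements touch; done by hand, it is exactly the kind of edit where a typo slips in.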



Class diagram for CodeDom namespace in .Net

The CodeDom namespace in .Net is one of several ways to develop your own compiler or source code generator in .Net. Even though it is somewhat abandoned, it currently supports the following languages:

  • C# (native, out-of-the-box)
  • VB.NET (native, out-of-the-box)
  • F# (native, out-of-the-box)
  • IronPython

CodeDom provides you with classes to build an abstract syntax tree of your code, which can then either be compiled to "binary" (i.e., CIL) or translated "back" to a high-level general-purpose language from the list above. This explains why it is a rather complex namespace, with many classes and a non-trivial class hierarchy. If you try to build a class diagram for this namespace, you end up with the following image:

Since it is absolutely impossible to work with this diagram, I prepared a set of diagrams, each covering a large constellation of classes.

So, here you have:

  1. A general overview of CodeDom namespace classes with "pruned" tree.

  2. Collections and enums.

  3. CodeStatement

  4. CodeType

  5. CodeExpression

     

So, feel free to use these diagrams in your work. There is no explicit license attached to these images (I haven't drawn them myself, just used the VS2010 designer), so if in doubt, just consider my work to be in the public domain.
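To make the tree-building workflow described above concrete, here is a minimal sketch: build a tiny namespace/class/method tree and translate it back to C# source. It assumes a .NET Framework project (or the System.CodeDom package on newer runtimes); the Demo/Greeter/Hello names are invented for illustration.

```csharp
using System;
using System.CodeDom;
using System.CodeDom.Compiler;
using System.IO;
using Microsoft.CSharp;

class Program
{
    static void Main()
    {
        // Build a tiny abstract syntax tree: namespace -> class -> method.
        var unit = new CodeCompileUnit();
        var ns = new CodeNamespace("Demo");
        var cls = new CodeTypeDeclaration("Greeter");
        var method = new CodeMemberMethod
        {
            Name = "Hello",
            Attributes = MemberAttributes.Public | MemberAttributes.Static
        };
        // Body: System.Console.WriteLine("Hello from CodeDom");
        method.Statements.Add(new CodeMethodInvokeExpression(
            new CodeTypeReferenceExpression("System.Console"),
            "WriteLine",
            new CodePrimitiveExpression("Hello from CodeDom")));
        cls.Members.Add(method);
        ns.Types.Add(cls);
        unit.Namespaces.Add(ns);

        // Translate the tree "back" to C# source text.
        using (var writer = new StringWriter())
        {
            new CSharpCodeProvider().GenerateCodeFromCompileUnit(
                unit, writer, new CodeGeneratorOptions());
            Console.WriteLine(writer.ToString());
        }
    }
}
```

Swapping CSharpCodeProvider for another language's provider emits the same tree in that language, which is the whole point of the language-neutral class hierarchy shown in the diagrams.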



How To: Improve the performance of the Visual Studio UML and class diagram designers

If you feel that the Visual Studio 2010 UML or class diagram editor has become slow and not as responsive as it used to be, especially when you are working with many objects at a time, try closing the Properties window. Strangely enough, this window gets populated every time you select an object, and if you select many objects it shows only the properties they have in common. This is the reason the editor becomes so slow (even though it does not explain why populating the Properties window takes so long).


Categories: how-to

A wonderful collection of bit hacks, interesting for every programmer

Have you ever tried to count the bits in a bit-array structure without using shifts? Or used bitwise XOR for primitive but effective cryptography? Then you will definitely like the following webpage, an absolutely wonderful collection of non-trivial bit manipulations for achieving marvelous results. Really a great place to read and take away some pieces of bit-manipulation magic.
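For a taste, here are the two tricks just mentioned, sketched in C#: Kernighan's bit count, which clears one set bit per iteration and uses no shifts at all, and the XOR round-trip cipher.

```csharp
using System;

class BitHacks
{
    // Kernighan's trick: v & (v - 1) clears the lowest set bit,
    // so the loop runs once per set bit -- no shifts involved.
    public static int CountBits(uint v)
    {
        int count = 0;
        while (v != 0) { v &= v - 1; count++; }
        return count;
    }

    // The "primitive but effective" XOR cipher: applying the same
    // key twice restores the original byte.
    public static byte[] XorCipher(byte[] data, byte key)
    {
        var result = new byte[data.Length];
        for (int i = 0; i < data.Length; i++)
            result[i] = (byte)(data[i] ^ key);
        return result;
    }

    static void Main()
    {
        Console.WriteLine(CountBits(0b1011)); // three bits set

        var secret = XorCipher(new byte[] { 1, 2, 3 }, 0x5A);
        var restored = XorCipher(secret, 0x5A);
        Console.WriteLine(string.Join(",", restored));
    }
}
```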

 



Object-Relational Mapping: a handy design pattern or a spoiling anti-pattern?

Today somebody asked a question about the nature of ORM in development: should we consider it a useful pattern or an anti-pattern?

Here are my thoughts on the topic (copied and pasted from StackOverflow):

Actually, ORM helps you to quickly implement database connectivity and write your application logic without paying much attention to the actual connection to the database. You can use the entities of your programming language while implementing the logic, and you don't have to care about how these are translated into the relational model of the database. This is the main advantage for me, and it is why ORM is so popular: you can develop a simple data-driven application in just a couple of hours.

So ORM, like many other technologies such as managed code, garbage collection and generics, is optimized for developer productivity, i.e., to minimize the number of developer hours (which are normally quite expensive) needed to implement certain functionality.

As soon as other criteria (performance, application size, flexibility of the logic, network throughput, the size of the source and compiled code) override the one above, ORM is not your friend anymore. But since this is not the common scenario, people usually don't care and choose ORM for their applications.



Why C++ is not good as the first programming language

This term I teach the course "Introduction into Programming with C++" for the first-year students of Engineering Sciences. 

The course was requested by the university and targets students with no prior knowledge of programming. The choice of C++ as the language of the course was not mine: it is the default language for teaching OOP at our university (Hamburg University of Technology), since the professor in charge does his research primarily in C++. So the decision to use C++ as the first language was more or less imposed on me by the dean's office.

During the very first session I gave the students several motivating examples. One of these was the classical task about the chessboard and the wheat grains, in which a vizier, truly enchanted by the wonderful game of chess, asks the inventor to name a decent reward, and the latter asks him to put one grain of wheat on the first square, two grains on the second square, and so on, until every square is covered. This seemingly trivial task yields a very large number (2^64 - 1, or 18,446,744,073,709,551,615) which is not so easy to compute directly using a brute-force approach (without deriving the summation formula for this geometric series).

So we implemented this task in several languages. Here is the one-line implementation in F# (Microsoft's version of OCaml for .Net):

let rice = 
     [0 .. 63] |> List.map (fun x -> 2I ** x)  |> List.sum |> Dump

The equivalent C# (and, with small changes, C++) code is much longer and requires keeping an eye on many more things:

void Main()
{
	ulong sum = 0;
	// 64 squares: sum 2^0 + 2^1 + ... + 2^63
	for(int i = 0; i < 64; i++)
	{
		sum += power2(i);
	}
	sum.Dump();  // LINQPad's Dump prints the result: 18446744073709551615
}

// Computes 2^power by repeated multiplication; note that the result
// for power = 63 only just fits into an unsigned 64-bit integer.
ulong power2(int power)
{
	ulong answer = 1;
	for(int i = 0; i < power; i++) answer *= 2;
	return answer;
}

This is why I agree with the widespread opinion that computer science, and especially algorithms, should first be taught using a non-imperative language, like Haskell, OCaml, Erlang, Scala or F#.



I will be at TechEd 2010 Europe in Berlin

Hi, everyone!

It has been quite some time since I wrote something in my blog, mostly because I was busy with my studies.

But today's news could not evade my blog: I will be attending TechEd 2010 in Berlin this year.

I will be working there as an invited MS Expert at Silverlight Booth.

So, if you happen to be there as well, come to our booth!


Categories: general

The Zen of live coding: Win7 Zoom feature and ZoomIt from Sysinternals!

If you have ever presented to a developer audience, you have definitely had to show some features or code samples live. I often combine live coding with slide presentations, not only because it makes people wake up, but also because it gives your presentation a professional and lively touch, helping you win the audience over if the topic does not seem extremely interesting to the majority of them.

During these live shows I often feel the need to zoom into a certain area of my screen to emphasize the actions I am taking or the menu items I choose. The easiest way to achieve this, especially if you are presenting on someone else's laptop, is to use the built-in Windows 7 zoom feature. By holding the Windows key (located between Ctrl and Alt on the left-hand side of the keyboard) and pressing "+" and "-" on the numeric keypad (known as "grey plus" and "grey minus") you can zoom in and out, respectively.

There are several zoom levels you can reach by pressing these buttons repeatedly, enabling you to focus on a tiny part of your screen. Your system remains fully responsive during and after zooming; you can just continue typing or choosing menu items, and the magnifier normally follows the mouse cursor (this feature is called "live zoom") unless explicitly set otherwise. If you want to leave zoom mode, left-click the magnifying glass and close the magnifier panel.

This built-in option gives you a very handy presentation tool working right out of the box. However, you might sometimes need a little more than just zooming. During my live presentations I often need to freeze the screen content for a while and explain something in more detail. I was desperately looking for a free solution to assist me here and found a very nice tool on the Sysinternals webpage called ZoomIt.

It is a small single executable (about 500 KB) which has to be started manually and which then resides in your tray, watching for keystrokes. The default key mappings (Ctrl + 1..4) collide with the keyboard layout switch on my system, which is why I changed them; you may wish to do the same. Just right-click the magnifying glass icon in the tray and choose "Options":

As you can already guess from the context menu, there are basically four different functions you can run via keystrokes or from this context menu.

  1. Zoom

    Zoom works similarly to the Win7 zoom feature, except that the mouse pointer is no longer visible; the screen zoom, however, follows your mouse movements until the first mouse click, which reveals the reason for this strange behavior: zoom mode captures a screenshot of your desktop, and you zoom into a still, non-interactive image. In this mode you have two options: clicking the right mouse button takes you back to the normal screen so you can continue working; clicking the left mouse button puts you in drawing mode, where you can draw with your mouse pointer, depicted as a small cross.

    By clicking the right mouse button you can go one step back and continue to zoom around the screenshot of your desktop. You can return to normal working mode either by clicking the right mouse button twice or just by pressing Esc while drawing. There are multiple keystrokes you can use in drawing mode; you get the full description on the "Draw" tab of the Options form.

    Strangely enough, one keystroke is missing from this otherwise detailed description (you will find it on the adjacent tab): if you press "t" in drawing mode you can enter any text (it will be printed in the color of your pen) and finish the text entry by pressing Esc. This helps you annotate screenshots quickly.

     

  2. Draw

    Draw mode is practically the same as described above; the only difference is that you don't need to zoom first, and you get a screenshot of your complete desktop as the canvas for your artistic exercises.

  3. Timer

    This is a nice option if you deliver a talk or workshop with assignments (like implementing "Hello, World!"). To dim your presentation or IDE and stress the importance of the exercise, you can start a stopwatch that counts down the time (10 minutes by default) until the deadline.

  4. LiveZoom

    Available only on Vista and later, this mode completely mimics the Win7 zoom feature, i.e., you have a fully functional desktop that you zoom into, and you can continue working (typing) while part of your desktop is shown magnified. Even though it may be handy as an alternative to the built-in feature, I often feel trapped when using it, because it is not obvious how to leave this mode (the Escape key does not help here). Just press the keystroke once more to go back.

In conclusion, I can only recommend keeping ZoomIt on your memory stick together with your PowerPoint presentation, so you can start this handy tool whenever you need to present something (no elevated privileges are required to start it). With some practice you will be able to produce rather complex annotations on your desktop using this tool alone!


Categories: english | sysinternals
Permalink | Comments (0) | Post RSS | RSS comment feed

How to disable the password request on logon in Windows 2000 – Windows 7?

Password protection is an essential part of any security policy, and you should consider disabling it only as a measure of last resort. A possible scenario is a home machine without domain membership, used only at home by family members. Do not disable the password request on your laptop under any circumstances; you may want to configure your laptop not to re-request your password on wake-up, but the password is essential for keeping your private data secure against undesired access in case the laptop is lost or stolen.

So, in order to disable the logon password request, do the following:

  1. Download the Autologon utility from the Sysinternals collection here:
    http://technet.microsoft.com/de-de/sysinternals/bb963905.aspx
  2. Start the tool with elevated privileges:
  3. Confirm the start with elevated privileges (using safe desktop if you have UAC activated on Vista/Win7).
  4. Accept the license agreement.
  5. Enter your computer name to log on as a local user, or the domain name for a domain logon.
    Then enter your username (local or domain) and password (shown as asterisks).
  6. Click "Enable" to enable autologon.

From now on your system will logon automatically.
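
Under the hood, Autologon essentially populates the well-known Winlogon registry values; the fragment below is only a sketch of the idea, with the account name "Alex" and machine name "HOMEPC" as hypothetical placeholders:

```reg
Windows Registry Editor Version 5.00

; Winlogon key consulted by the logon process at boot
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
; "1" enables automatic logon, "0" disables it again
"AutoAdminLogon"="1"
; account to log on with, and its domain (or the computer name for a local account)
"DefaultUserName"="Alex"
"DefaultDomainName"="HOMEPC"
```

One important difference: instead of writing a plain-text "DefaultPassword" value here, Autologon stores the password encrypted as an LSA secret, which is a good reason to prefer the tool over editing the registry by hand.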

In order to disable autologon in the future, you have to either change the password of your account or re-start the tool and click "Disable". A message window will report the new status of autologon (enabled or disabled).

This solution was tested on Windows XP Professional and Windows 7 Ultimate; reportedly it also works on Windows 2000.


Categories: sysinternals | english
Permalink | Comments (0) | Post RSS | RSS comment feed

Technorati relaunched! How to make your blog discoverable.

After reviving my blog with the Sysinternals entry, I went to Google Analytics to check what kind of users visit this blog. To my surprise, more people are looking for my German CV than for anything from the IT area. That made me think it would be nice to register my blog somewhere like Technorati in order to attract more visitors from my target group.

Since the relaunch of Technorati there is no longer any need to add Technorati tags to every post; the site has implemented the architecture known in the IT world as "inversion of control", meaning that rather than submitting your posts with certain tags, you just post normally and the system tries to discover your blog posts and put them into the respective category.

In order to make your blog "discoverable", you have to sign up (or sign in, depending on your current status at Technorati) and claim your blog. After filling in all necessary fields, such as the URL and feed URL, you are asked to put a code (PBFE7PQ3VWN9) provided by the page into your most recent blog entry in order to prove you are the owner of the blog and verify it.

As soon as the Technorati crawler verifies the token, your blog is scheduled for review, which may take some time.


Permalink | Comments (0) | Post RSS | RSS comment feed

How to measure the execution time of your program in Windows?

There are often cases where you need to measure the execution time of your program. Working in Linux (or another Unix-like system) one gets used to the "time" utility, which measures both user and kernel time of any executable: you just prepend it to the complete command line (for instance, "time du").

It is a pity that there is no such handy utility in a standard Windows installation, and the "timeit.exe" solution one finds on many Internet forums does not seem to work in all cases (at least not in my case of Windows 7 Ultimate). There is, however, a nice workaround for this problem, provided you only want to measure the execution time of one or a few runs: Process Explorer from Mark Russinovich's Sysinternals toolset. This approach does not help if you need to perform many measured runs of your software; please refer to the Visual Studio Profiler in that case.

  1. So, first you need to download the latest version of Process Explorer from here:
    http://technet.microsoft.com/en-us/sysinternals/bb896653.aspx
       
  2. Once you have downloaded it and agreed to the license agreement, you will see the main screen:

  3. First you should add the counters you would like to track for your program to the list of visible columns. To do so, go to View and choose Select Columns.

  4. Go to the "Process Performance" tab



    and select the counters you would like to see in the general process list. In order to get the total execution time, tick the "CPU Time" option:



    Click OK to go back to the main screen, and you will see the new CPU Time column at the very right of the process table. The order of the columns can easily be changed by clicking on a column caption and dragging it to the desired position.

    Note: You might need to enlarge your Process Explorer window to see the new column.
  5. Now you are almost ready to capture the information you need. If you start your process, immediately switch back to Process Explorer, find it in the list of running processes and wait until it terminates, you can read the total execution time from the CPU Time column. This works fine in many cases, but you may encounter two kinds of problems:
    1. If the process runs only briefly, you may miss it entirely, because it has already finished by the time you switch back to Process Explorer and find it in the list.
    2. Even if you manage to catch your process, the information about its total runtime is quickly purged from the process list.
  6. To solve these problems, let us increase the time a process is shown in the list after it has terminated. For this purpose go to Options -> Difference Highlight Duration and set the duration to 9 seconds:



    Now the process will remain in the list for 9 seconds after its termination.



    That should give you enough time to write down the execution time. Now just start your program and read its execution time after termination. You can easily identify a recently terminated process by its red background highlighting (in the screenshot you can see "pan.exe", the compiled verifier from Spin).
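
If you prefer a scriptable alternative, PowerShell's Measure-Command cmdlet reports the wall-clock time of a command, but not its CPU time. As a rough cross-check, the CPU time of a child process can also be read programmatically; the following minimal Python sketch (my own illustration, not part of the Sysinternals toolset) reports both wall-clock time and, on Unix-like systems, the accumulated child CPU time. Note that on Windows os.times() reports the children fields as zero, so there you would need something like the GetProcessTimes API instead.

```python
import os
import subprocess
import sys
import time

def measure(cmd):
    """Run cmd and return (wall-clock seconds, child CPU seconds)."""
    start_wall = time.perf_counter()
    before = os.times()              # snapshot of accumulated child CPU times
    subprocess.run(cmd, check=True)  # run the program and wait for it
    after = os.times()
    wall = time.perf_counter() - start_wall
    # children_user/children_system accumulate CPU time of waited-for children
    cpu = (after.children_user - before.children_user) + \
          (after.children_system - before.children_system)
    return wall, cpu

if __name__ == "__main__":
    # example: time a short Python computation in a child process
    wall, cpu = measure([sys.executable, "-c", "sum(range(10**6))"])
    print(f"wall: {wall:.3f}s, cpu: {cpu:.3f}s")
```

Unlike the Process Explorer approach, this never misses a short-lived process, since the parent waits for the child and reads the accumulated times afterwards.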


Categories: general | sysinternals
Permalink | Comments (0) | Post RSS | RSS comment feed