April 16, 2014

Extending Splunk’s search language with custom search commands.

I recently blogged about a feature I worked on: custom search commands for Splunk using our Python SDK. Custom search commands let you extend Splunk’s search language with new commands that can do things like apply custom filtering, perform complex … Continue reading

April 10, 2014

ReSharper and Roslyn: Q&A

As you probably know from the news out of last week’s Build conference, Roslyn, the new .NET Compiler Platform with code analysis APIs for C# and VB.NET, has reached the Preview phase and been open sourced. In case the event slipped your attention, here’s a nice tour of Roslyn on the C# FAQ blog at MSDN.

We at JetBrains were immediately faced with a stream of questions about the prospects of ReSharper using Roslyn for its code analysis, and about how the two tools might compete. The flow of questions wouldn’t end, to the point where we introduced a template to answer them:

ReSharper and Roslyn? I dare you!!

Seriously though, it was clear that we needed to elaborate on the ReSharper vs Roslyn issue. Hence this post.

We sat down with Sergey Shkredov (@serjic), ReSharper Project Lead and .NET Tools Department Lead at JetBrains, and Alex Shvedov (@controlflow), a Senior Developer on the ReSharper team who’s responsible for ReSharper’s Generate functionality, code annotations and support for XML-based languages. The following Q&A is a summary of the conversation that we had with them.

What’s JetBrains’ stance towards Roslyn? Do we consider the technology and its Open Source status important and valuable?

Roslyn is definitely important and a good step forward for Microsoft in that it should help Visual Studio users take advantage of more C# and VB.NET code editing and analysis features in Visual Studio out of the box.

It should also help Visual Studio extension developers write code-centric extensions against a consistent API while having the opportunity to know how it works inside, thanks to the Open Source status of the project. This is not to mention hackers who are willing to spend their time forking the compiler and tuning it to make, say, C# the ideal language they’ve always envisioned.

We also believe that Roslyn is no less important for Microsoft itself. Faced with the burden of maintaining a plethora of Visual Studio integrated tools, including code editing tools, IntelliTrace and code designers, the folks at Microsoft are interested in making these tools as flexible and easy to update as possible. Roslyn should enable updating .NET languages and experimenting with them faster than before. Apart from that, the old compiler didn’t allow compilation steps to be launched in parallel; Roslyn is expected to enable that, bringing more scalability to the table.

What’s the point of making Roslyn Open Source?

As to the act of letting Roslyn go Open Source, we don’t believe that Microsoft is expecting anyone from outside of the company to develop the compiler for them. Programming languages are entities too monolithic and complex to justify accepting external changes of any significance. Therefore, we expect Microsoft to keep the function of designing .NET languages totally to itself without depending on the community.

The true value of Roslyn going Open Source lies in enabling extension developers to look into Roslyn code that is relevant to their purposes: how it’s written and whether it’s efficient. They might debug or profile it to see if it’s the culprit of unexpected behavior in their extensions or if it introduces performance issues. This is possibly the workflow whereby meaningful pull requests might start coming in to the Roslyn repository.

As to possible endeavors to fork and modify the compiler to address application- or domain-specific tasks, this scenario looks like shooting yourself in the foot. Even if the default compiler in Visual Studio can be replaced with a fork, tooling support for the fork ends as soon as you go beyond Visual Studio. In theory we can imagine a custom INotifyPropertyChanged implementation based on a Roslyn fork that even gains a certain popularity. However, we can hardly imagine supporting it in ReSharper, as our intention is to focus on supporting the official version of Roslyn.

Will ReSharper take advantage of Roslyn?

The short answer to this tremendously popular question is, no, ReSharper will not use Roslyn. There are at least two major reasons behind this.

The first reason is the effort it would take, in terms of rewriting, testing and stabilizing. We’ve been developing and evolving ReSharper for 10 years, and we have a very successful platform for implementing our inspections and refactorings. In many ways, Roslyn is very similar to the model we already have in ReSharper: we build abstract syntax trees of the code and create a semantic model for type resolution, which we use to implement our many inspections and refactorings. Replacing that much code would take an enormous amount of time and risk destabilizing currently working code. We’d rather concentrate on the functionality we want to add or optimize than spend the next release cycle reimplementing what we’ve already got working.

The second reason is architectural. Many things that ReSharper does cannot be supported with Roslyn, as they’re too dependent on concepts in our own code model. Examples of these features include Solution-Wide Error Analysis, code inspections requiring fast lookup of inheritors, and code inspections that require the “big picture,” such as finding unused public classes. In cases where Roslyn does provide suitable core APIs, they don’t come with the benefit of years of optimization behind them: say, finding all derived types of a given type in Roslyn implies enumerating all classes and checking whether each of them derives from it. On the ReSharper side, this functionality belongs to the core and is highly optimized.
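
To make that concrete, here’s a minimal sketch of the naive enumeration described above, written against Roslyn’s present-day public symbol API (an illustration, not ReSharper’s or Roslyn’s actual code): walk every named type reachable from a compilation’s global namespace and test its base-type chain.

    using System.Collections.Generic;
    using Microsoft.CodeAnalysis;

    static class DerivedTypeFinder
    {
        // Naive inheritor search: visit every namespace and type, and keep
        // the types whose base-type chain contains the requested base type.
        public static IEnumerable<INamedTypeSymbol> FindDerivedClasses(
            Compilation compilation, INamedTypeSymbol baseType)
        {
            var pending = new Stack<INamespaceOrTypeSymbol>();
            pending.Push(compilation.GlobalNamespace);

            while (pending.Count > 0)
            {
                var symbol = pending.Pop();

                // Namespaces contain child namespaces and types; types can
                // contain nested types. Queue them all for inspection.
                foreach (var member in symbol.GetMembers())
                    if (member is INamespaceOrTypeSymbol child)
                        pending.Push(child);

                if (symbol is INamedTypeSymbol type && DerivesFrom(type, baseType))
                    yield return type;
            }
        }

        private static bool DerivesFrom(INamedTypeSymbol type, INamedTypeSymbol baseType)
        {
            for (var current = type.BaseType; current != null; current = current.BaseType)
                if (SymbolEqualityComparer.Default.Equals(current, baseType))
                    return true;
            return false;
        }
    }

The cost of a walk like this grows with the total number of types in the solution, which is exactly what hurts “big picture” inspections; ReSharper’s core answers the same query from its highly optimized internals instead of re-enumerating every type.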

The code model underlying ReSharper features is conceptually different from Roslyn’s code model. This is highlighted by drastically different approaches to processing and updating syntax trees. In contrast to ReSharper, Roslyn syntax trees are immutable, meaning that a new tree is built for every change.
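
Here’s a tiny illustration of what that immutability means in practice (hypothetical sample code against Roslyn’s public C# API): “renaming” a class produces an entirely new tree, while the original stays intact.

    using System;
    using System.Linq;
    using Microsoft.CodeAnalysis.CSharp;
    using Microsoft.CodeAnalysis.CSharp.Syntax;

    class ImmutabilityDemo
    {
        static void Main()
        {
            var tree = CSharpSyntaxTree.ParseText("class A { }");
            var root = (CompilationUnitSyntax)tree.GetRoot();
            var oldClass = root.Members.OfType<ClassDeclarationSyntax>().First();

            // Nothing is mutated: WithIdentifier and ReplaceNode each return
            // fresh nodes, so newRoot is a separate tree that shares the
            // unchanged parts with the original.
            var newClass = oldClass.WithIdentifier(SyntaxFactory.Identifier("B"));
            var newRoot = root.ReplaceNode(oldClass, newClass);

            Console.WriteLine(root.ToFullString());    // still prints the original text
            Console.WriteLine(newRoot.ToFullString()); // prints the renamed copy
        }
    }

This persistent design makes trees safe to share across threads, but every edit allocates new nodes, which is the memory-traffic concern raised later in this Q&A.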

Another core difference is that Roslyn covers exactly two languages, C# and VB.NET, whereas ReSharper’s architecture is multilingual, supporting cross-language references and non-trivial language mixtures such as Razor. Moreover, ReSharper provides an internal feature framework that streamlines consistent feature coverage for each newly supported language. This is something that Roslyn doesn’t have by definition.

Will it be practical to use both ReSharper and Roslyn-based functionality in Visual Studio?

This is a tricky problem, as it’s still uncertain whether we would be able to disable Roslyn-based features (such as refactorings or error highlighting) when integrating into new releases of Visual Studio. If we’re unable to do that, performance would take a hit. Apart from ReSharper’s own inherent memory and performance impact, Roslyn’s immutable code model would increase memory traffic, which would in turn lead to more frequent garbage collection, negatively impacting performance.

We’re hopeful that this problem will be solved in favor of letting us disable the Roslyn features that ReSharper overrides, because otherwise ReSharper would have to work in a highly resource-restricted environment. Regardless of whether that happens, though, we’ll keep doing what we can to minimize ReSharper’s own performance impact.

As Roslyn is now Open Source, which parts of its code are going to be of particular interest to ReSharper developers?

We’ll be sure to peek into Roslyn’s code and tests from time to time to see how C# and VB.NET language features are implemented. We wouldn’t be surprised if actual code supporting new language features emerges before their formal specifications are finalized. In fact, we’ve already started.


That’s more or less the picture of living in the Roslyn world as we see it today. As time goes by, we’ll see if things turn out the way we expected them to.

Meanwhile, if you have questions that were not addressed in this post, please ask in the comments and we’ll try to come up with meaningful answers.

April 09, 2014

Introducing dotPeek 1.2 Early Access Program

It has been a while since dotPeek, our free .NET decompiler, last received an update, but that doesn’t mean we’ve put it aside. Today we’re ready to launch the dotPeek 1.2 Early Access Program, which introduces a substantial set of new features.

Starting with version 1.2, dotPeek learns to act as a symbol server and supply the Visual Studio debugger with the information required to debug assembly code. This can be most useful when debugging a project that references an assembly from an external class library.

dotPeek listens for requests from the Visual Studio debugger, generates PDB files and source files for the requested assemblies on demand, and returns them to the debugger. dotPeek provides several options for choosing exactly which assemblies you want it to generate symbol files for.

Symbol server options in dotPeek 1.2 EAP

To learn more on how to set up dotPeek as a symbol server and use it for debugging in Visual Studio, please refer to this guide.

If the Visual Studio cache already contains PDB files for certain assemblies but you would like to replace them with PDB files generated by dotPeek, use the option to generate PDB files manually. To do that, simply select an assembly in dotPeek’s Assembly Explorer, right-click it and choose Generate PDB.

Generate PDB in dotPeek 1.2

dotPeek can export assemblies to projects and generate PDB files in the background, meaning that you can keep exploring assemblies during PDB generation or assembly export. To address cases where it’s not clear whether PDB files were generated properly, dotPeek provides a dedicated tool window that shows the current status and results of PDB generation.

PDB generation status in dotPeek 1.2 EAP

In addition to the set of features that streamline debugging decompiled code, dotPeek 1.2 adds quick search and node filtering in various trees, most notably the Assembly Explorer. Searching and filtering support lowerCamelHumps in these scenarios, so typing “ae”, for example, can match AssemblyExplorer; a toy sketch of the matching idea follows below.

Search in Assembly Explorer in dotPeek 1.2 EAP
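
If you’re curious how this kind of matching works, here’s a toy sketch of the idea (my illustration, not JetBrains’ actual algorithm, which also handles runs of consecutive characters and other niceties): each query character must match, in order, one of the identifier’s “humps”.

    using System.Linq;

    static class CamelHumps
    {
        // Toy camel-humps matcher: a candidate's "humps" are its first
        // letter plus every uppercase letter; the query must match them
        // in order. E.g. "ae" matches "AssemblyExplorer".
        public static bool Match(string candidate, string query)
        {
            var humps = candidate
                .Where((c, index) => index == 0 || char.IsUpper(c))
                .Select(char.ToLowerInvariant)
                .ToArray();

            var next = 0;
            foreach (var q in query.ToLowerInvariant())
            {
                // Advance to the next hump that matches this query character.
                while (next < humps.Length && humps[next] != q)
                    next++;
                if (next == humps.Length)
                    return false; // ran out of humps
                next++;
            }
            return true;
        }
    }

With this sketch, CamelHumps.Match("AssemblyExplorer", "ae") returns true, while CamelHumps.Match("AssemblyExplorer", "ax") does not.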

If you’re interested in the other fixes and improvements made for the dotPeek 1.2 EAP, this link should help you out.

Does the above sound enticing? Download dotPeek 1.2 EAP and give it a try!

April 03, 2014

Always Backup. Always

Last night I learned (or I should say re-learned) a hard lesson. Several lessons, actually. More on that in a moment.

What Not to Do

I built a VM using Hyper-V to have an isolated environment for client work. Stored the VM and its .vhdx file on an external drive. So far, so good.

But for performance, I figured it'd make sense to set the VM up for boot to VHD. Did I mention the VM was installed on an external disk?

Attached the VHD, made sure to copy over the boot files, and used msconfig to add it to the boot order. msconfig warned me that I might need to enter my recovery key on the next boot, but I figured "no problem, I'll just pull up http://onedrive.live.com/recoverykey and have it ready to go."

Well, it turns out that if you've deleted your recovery partitions to save space (which I have), you may not have the opportunity to enter a recovery key during the boot process. And the bootloader apparently couldn't see my external drive (I'm guessing because it had not yet been properly initialized at that point in the boot process), so I couldn't boot from the VHD. And when I tried to boot to my base OS, I got a message that said there were "no recovery options" on my PC, presumably because I'd deleted the recovery partitions earlier.

I was able to get my recovery USB drive (yes, I had one) to boot, but startup repair wasn't able to fix the problem, nor did using a restore point from earlier in the day.

YUNoBackup

In short, I made rather a mess of my machine, and today I'm slowly reinstalling applications after using "Refresh my PC" to get back to a state where I can at least boot the machine.

Lessons learned:

  1. Any time you're making significant changes to your machine, create a full image backup FIRST. If I'd had that, restoring to a good state would have been easy.
  2. If you decide to remove recovery partitions to save space, be aware that some recovery scenarios may not work, particularly if you don't use the recommended process. For a variety of reasons, I wasn't able to use the recommended means of copying the recovery partition(s) to USB before deleting them, so while I had a recovery drive, it may not have had all the necessary stuff to properly recover.
  3. Did I mention back up before significant changes?
  4. Don't try to multi-task. Humans can't do it. What many people call multi-tasking is actually switch tasking, and it's inefficient at best. If you're doing something that requires your full attention (and anything that can trash your machine counts), give it your full attention.
  5. While Windows 8/8.1's recovery options are quite robust, it's still possible to put your machine in a state that the recovery tools can't handle. Plan for that. Back your PC up. Frequently. Where possible, I like to use the built-in System Image Backup. Alas, that's pretty hidden in Windows 8.1, but you can find it tucked away in the corner of the File History dialog in the Control Panel, as shown below:
     SystemImageBackup
  6. It's all too easy to be complacent when dealing with computer environments. Don't be. If your livelihood depends on it, back it up regularly, and have a recovery plan in case something fails.
  7. Recovery, even when things aren't perfect, is easier than ever, thanks to the cloud. Because most of my data is in the cloud, connecting back to the relevant cloud service is sufficient to get my data where and when I need it. Reinstalling applications is a pain, but ultimately more manageable than recovering lost local data.

Conclusion

If I haven't been clear enough already, the moral of this story is simple: back up. Whether you use System Image Backup or a third-party utility that does something similar, whenever you get a machine to the point that it's configured the way you like it, do a full image backup, so that you can easily get back to that point in a pinch. I used to run a Windows Home Server that automatically backed up every PC in the house, and I used those backups on a couple of occasions to restore a machine with issues. But unfortunately, WHS doesn't seem to have a future, so once my WHS machine died (power supply, I think), I didn't bother to build a new one. Rethinking that…perhaps it's time to look at Windows Server 2012 Essentials.

But however you may choose to go about it, backup frequently, and don't be complacent about making significant changes to your PC(s). Unless you'd like to learn the hard way, too.

Your Turn

Got some tips or best practices on backup and recovery? Let me know in the comments!

April 01, 2014

Boom Or Bust!

This post is for anyone who podcasts, videocasts, or otherwise relies for their living (or hobby) on recording their voice on their computer. It's particularly addressed to folks like myself who record screencasts, in which you're teaching people to use software, oftentimes including demos in which you're typing live while recording your screencast.

Get a Boom!

Last year, I wrote a primer on audio gear for podcasters, as well as a follow up with some additional recommendations. In the first of those posts, I mentioned that I use a RODE PSA1 boom arm for my microphone. This is possibly one of the most important pieces of audio gear I own, even though there isn't a single bit of electronics in it.

Why? Two words.

Audio Quality.

A Boom Can Help Your Sound

The motivation for this post is a video tutorial I was listening to (I won't share where, or who authored it, as that's not really the point). From the sound quality of the video, it sounds as though the author is using either a built-in mic on their laptop, or perhaps an inexpensive USB mic on a desktop stand. There's a fair amount of ambient echo, which is typical for rooms that haven't been acoustically treated, and which is usually perfectly fine.

Noise, Noise, Noise

What's not fine is hearing repeated thudding each time the author hits their desk. And the thud-thud-thud that comes with every keystroke during the demos. Understand, the point is not to knock the author of this course. It's to point out that there's a very easy way to avoid this…a boom.

Photo from Oregon State University digital collection
OK, so this model probably isn't sold anymore…

A microphone boom arm helps isolate your microphone from sources of noise, including inadvertent taps and bangs on your desk, as well as keyboard noise transmitted through the desk. You can get some of this benefit from a shock or spider mount, but a boom does the best job at isolating vibrations, which is why they're used by radio professionals, who rely on good sound for their living.

Signal is King

The other big thing that a boom mic can do is improve the signal (i.e. your voice) by allowing you to place the mic closer to the source, namely you. With a boom mic, you can place the mic within a few inches of your face, which will ensure that what you record contains more of your voice, and less of whatever else is going on (echos, outside noises, etc.). If you don't have a pop filter, just speak slightly off-axis to the mic (turn slightly to the left or right), and you should be able to get a great signal.

Is It Worth the Cost?

The boom I use costs right around $90. If you podcast or videocast as a hobby, that may be more than you'd like to spend. In that case, I'd recommend looking into cheaper solutions, like a spider mount (and to be clear, a spider mount is a must even if you do use a boom). But if you get paid for recording your voice, a boom is a seriously worthwhile expenditure. If your audio has problems, most people won't tell you that directly. They may not even notice it consciously. But they'll probably stop listening.

And in addition to the improvements in audio quality a boom can bring, it also adds convenience. Having my mic on a boom means that there's one less thing cluttering my desk. When I'm done recording a given podcast or video, I just swivel it up out of the way.

Your Turn

I'd love to hear from other podcasters and videocasters about any tips you have for getting the most out of your gear. Drop a comment below, or feel free to use my contact form.

January 05, 2013

Speaking at SQL Saturday #184 (North Haven CT) 03/02/2013

I am presenting at the SQL Saturday #184 event in North Haven, CT on March 2, 2013. The presentation topic is PowerShell and SQL Server – ‘Get-PowerShell | Get-SQLServer’.

Primarily aimed at SQL Server DBAs and developers, it’ll introduce PowerShell and show how it applies to SQL Server and related tasks. More details are available at the SQL Saturday #184 session site.

January 01, 2013

Microsoft MVP Award for 2013: SharePoint Server


My thanks to Microsoft for the Most Valuable Professional (MVP) Award for 2013 – the 6th time in a row since 2008! The award is based on SharePoint Server work, working with the CTDOTNET/CTSPUG developer groups, and actively presenting and participating in numerous events – details are at the MVP Profile site.

More about the plans in the works for this year is coming shortly – some exciting stuff for the community!

April 28, 2011

Siemens Smart Grid Innovation Contest

The Smart Grid Innovation Contest is an open international competition to find new, sustainable Smart Grid business models and technologies for the near future.

Siemens believes in the future of the Smart Grid for a more sustainable world – a vision of intelligent, flexibly controllable power generation, distribution, and consumption. The breakthrough of Smart Grid applications, though, strongly depends on attractive business models that combine technologies and economic benefits.

The Smart Grid Innovation Contest consists of two phases: during the first phase, ideas are generated and developed in a collaborative community. In the second phase, universities are invited to submit research proposals to further elaborate and develop the ideas.

Idea contest (for everyone): April 13 to May 31, 2011

Call for proposals (for universities): October 4 to November 30, 2011

Siemens will award €15,000 and a workshop trip to Berlin with Siemens Smart Grid experts to the five best ideas and the most valuable contributions. In a joint effort with several universities, more than €1,000,000 will be invested to translate the participants’ ideas into innovations. The contest calls on your creativity and your local expertise in making energy systems smarter and more environmentally friendly.

Watch an idea grow, through suggestions, comments and ranking, into a mature and realistic innovation!

Full competition details and rules at:
http://www.siemens.com/smartgridcontest

September 22, 2010

Build a Quadrocopter using .NET Micro Framework and win a VS2010 + MSDN Subscription

For those of you who are interested in or working on a Quadrocopter controlled by the .NET Micro Framework, there is a contest where the builder of the winning flying Quadrocopter will get a free VS2010 license, including a 1-year MSDN subscription.

Check my blog at http://netmicroframework.blogspot.com/

September 28, 2008

IndiaStockQuotes Version 1.2.1

It has been a long time since I actually worked on the IndiaStockQuotes component. I had some time over the weekend, fixed some bugs in the component, and got out a new release. I also upgraded the component from .NET 2.0 to 3.5. I don't yet use any 3.5-specific features, so you should be able to recompile the source against 2.0 and still get it to run.

Check it out at India Stock Quotes

September 26, 2008

Moving a project from VS 2005 to VS 2008

When you open a VS 2005 project in VS 2008, Visual Studio offers to migrate the project to the new format. Usually there should be no problem with this, and all your project files, solution files, test cases, etc. should move seamlessly to the 2008 format.

Targeted framework setting

But if you build your project, you will notice that your output assemblies actually target .NET Framework 2.0 and not 3.5. This is because the migration retains the targeted framework to make sure your application does not fail. The method for changing this setting after migration is not easy to find.

For VB projects, this setting is actually hidden inside the My Project -> Compile -> Advanced Compiler Options dialog. Obviously, this is not very easy to find. (See image.)

In C# projects, this setting is a lot easier to find, right on the Project Properties -> Application tab. I am not sure why the VB team made this setting so difficult to find.
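
In both languages, the dialog ultimately edits a single MSBuild property in the project file, so a hand edit of the .csproj or .vbproj works too. A minimal sketch of the relevant fragment (TargetFrameworkVersion is the standard property in VS 2008-era projects):

    <PropertyGroup>
      <!-- The migrated project keeps v2.0; change it to target .NET 3.5 -->
      <TargetFrameworkVersion>v3.5</TargetFrameworkVersion>
    </PropertyGroup>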