Yesterday Microsoft started introducing Visual Studio 2010 to Windows developers with a press release and an MSDN website. Introductions to the next Visual Studio also popped up on various technology news sites; InformationWeek, ChannelWeb, Microsoft Watch, BetaNews, and Ars Technica each have a brief summary and explanation of the information Microsoft has released so far. Only NetworkWorld digs into the subject by asking various developers to give their impressions of the new Visual Studio.
All of the attention is being focused on Visual Studio Team System 2010 and its emphasis on “riding the next-generation platform wave, inspiring developer delight, powering breakthrough departmental applications, enabling emerging trends such as cloud computing, and democratizing application life-cycle management.” The MSDN website expands on each focus area with a small blurb, but only application life-cycle management is fleshed out in any detail.
The application life-cycle management category is the one being talked about right now. Basically, it is about pulling modeling, testing, and collaboration closer together. For instance, Visual Studio will automatically generate a list of tests to run against the code. For the testers, a feature dubbed “TiVo for debugging” has been developed. The feature will record a testing session, which can be appended to a bug report. The developer can then play back the session and hopefully track down the bug. The effects are two-tiered: the developer gets information to help squash the bug, and the tester does not have to convince the developer that the bug exists.
UML (Unified Modeling Language) support has been added in the new version, and it is the basis for Visual Studio's modeling tools. The idea is that giving programmers access to the design tools, and to tools that expose the design, will help them visualize the program. A new tool, the “architecture explorer,” will display the relations and dependencies in a given piece of code in a nicely visual format.
There are hints about what the other categories mean scattered throughout the articles. Feature requests from developers have been added, which will hopefully increase their delight. Apparently, many of the features were built with agile programming in mind, so the new features and tools should help facilitate that development style.
The next-generation platform and emerging trends are two interesting categories. That Visual Studio 2010 will focus on Windows 7 is a given; what isn't a given is which other platforms and emerging trends Visual Studio will target. GPU programming and parallelization, for example, are two areas where programmers could use some help. Microsoft also announced the .NET 4.0 framework along with Visual Studio, but it has been quiet about features and details so far. The new .NET framework could be one platform they are targeting. Then there is the Microsoft Watch article, which contains a quote by Microsoft CEO Steve Ballmer indicating that a cloud-centric operating system will be unveiled at Microsoft's Professional Developers Conference.
The “powering breakthrough departmental applications” category is fairly nebulous. It seems to translate to, “You can still code programs in Visual Studio 2010. You won't have to buy a separate program.”
Is this one compatible with all the popular viruses or am I going to have to hack them to get them to run on XP?
I don't know if MSDN has editors who can authorise/censor releases, but that webpage was so full of marketing double-speak that it's killing my brain. Every time I read one of those new-age marketing words, my brain automatically replaces those terms with bullshit. Even then, the article doesn't make any sense.
And I'm a professional software engineer with over 12 years' experience.
“Microsoft described the next release through the following five focus areas: riding the next-generation platform wave, inspiring developer delight, powering breakthrough departmental applications, enabling emerging trends such as cloud computing, and democratizing application life-cycle management (ALM).”
The PR office probably celebrated after writing this.
You don’t have to justify your comment with your experience. What you said was entirely true, even to those who know nothing about it.
First, I suggest anyone reading the article just skip the bullet points. It sounds like it was run through the buzz word generator. http://pdos.csail.mit.edu/scigen/
On the whole it is very light on real content. It looks like they added a UML graphing utility that the pointy haired boss can fiddle with. It says that it is to ‘enable users’, but the users aren’t going to care. Most of the time programming is just black magic as far as they care.
There is also a debugging tool that is meant to capture enough system and application state from the testers to get all the information needed to reproduce bugs, eliminating the “no-repro” bugs, in the parlance they just made up. This one might actually be pretty cool, as long as you have enough of a bullshit detector to realize that it probably won't eliminate non-reproducible bugs like they claim.
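The idea of capturing state for a repro is easy to sketch. This is not how the Visual Studio feature works internally (Microsoft has released no such details); it's just a minimal illustration, in Python with made-up names, of bundling environment details plus a rolling tail of recent user actions into a bug report:

```python
import json
import platform
import sys
from collections import deque

class SessionRecorder:
    """Hypothetical sketch: keep a rolling log of recent events plus
    environment details, so a bug report carries repro context."""

    def __init__(self, max_events=100):
        self.events = deque(maxlen=max_events)  # only the tail is kept

    def record(self, action, **details):
        self.events.append({"action": action, "details": details})

    def bug_report(self, summary):
        return {
            "summary": summary,
            "environment": {
                "os": platform.system(),
                "python": sys.version.split()[0],
            },
            "recent_events": list(self.events),
        }

recorder = SessionRecorder(max_events=3)
recorder.record("open_file", path="report.txt")
recorder.record("click", button="Save")
recorder.record("click", button="Save")  # second click triggers the bug
recorder.record("crash", error="NullReferenceException")

report = recorder.bug_report("App crashes on double Save")
print(json.dumps(report, indent=2))
```

The point of the real feature, presumably, is that the tester never has to write this report by hand; the session log is attached automatically.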
No automated piece of software will ever capture as much useful information as a good QA department.
That’s true, but such tools can go a long way towards making software more robust. If you take that approach, do you not bother with unit tests? After all, the testing department will catch the bugs anyway.
I agree… nothing is more useful (for lower layer software,to be sure) than a set of great assertions so that the debug builds of the code test themselves to the greatest extent possible. And of course a set of unit tests to drive the system through at least the mainstream cases.
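As a language-neutral illustration of that self-testing debug-build style (the thread is about .NET, but the idea carries everywhere): in Python, `assert` statements run normally but are stripped when the interpreter runs with `-O`, analogous to debug-build assertions in C:

```python
def transfer(balance, amount):
    """Illustrative function: move `amount` out of `balance`, guarding
    invariants with asserts. The checks vanish under `python -O`,
    like C's assert() in a release build."""
    assert amount >= 0, "precondition: amount must be non-negative"
    assert amount <= balance, "precondition: cannot overdraw"
    new_balance = balance - amount
    assert new_balance >= 0, "postcondition: balance stays non-negative"
    return new_balance

print(transfer(100, 30))  # 70
```

A debug run of the test suite then exercises every assertion for free; the release build pays no cost for them.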
Unit tests are as important as anything else. There are a series of tests that need to go into making quality software. If any step of the process is skipped, your standard of quality goes down. The more complex the system you are creating is, the more important this is.
It starts with unit tests, which each test a small atomic unit of work and run very fast. These should all pass, at least in the module the developer is working on, before every check-in. This is a safety net for regressions due to bug fixes, and it also increases confidence for refactorings. If you need to change the way something works, unit tests give you immediate feedback on how those changes impact the rest of the system.
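A minimal sketch of that first tier, in Python's `unittest` for illustration (the function under test is made up; the same shape applies in NUnit or MSTest on the .NET side):

```python
import unittest

def word_count(text):
    """Small atomic unit of work under test: count whitespace-separated words."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    # Fast, isolated checks: run these before every check-in so a
    # bug fix or refactoring that breaks word_count fails immediately.
    def test_simple_sentence(self):
        self.assertEqual(word_count("unit tests catch regressions"), 4)

    def test_empty_string(self):
        self.assertEqual(word_count(""), 0)

    def test_extra_whitespace(self):
        self.assertEqual(word_count("  spaced   out  "), 2)

unittest.main(argv=["prog"], exit=False, verbosity=0)
```

Because each test touches no outside infrastructure, the whole suite runs in milliseconds, which is what makes the before-every-check-in discipline practical.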
Check-ins should trigger automatic builds, and those builds should trigger integration tests. Integration tests are more end-to-end, and will interact with outside infrastructure (like databases, web services, etc.).
There should be nightlies that go out to QA. QA runs scenario tests where they methodically hammer at the software through the interface. The scenarios they use should be written against the specs.
There should also be performance tests at the end of an iteration, (2-4 weeks) so that you are able to track the performance impact recent changes have had on the system.
Keep in mind I am an enterprise guy passionate about agile in general, and Scrum in particular, so if I talk strongly about testing that is why. But I strongly believe there is no way to create quality software without the right kinds, and the right amount, of testing.
Unit testing should always be done by the developer IMO. Most software practices follow that rule now.
The Networld link should be:
http://www.informationweek.com/news/windows/operatingsystems/showAr…
Actually, the “networkworld” link shouldn’t point to InformationWeek, which is a competing publication.
And NetworkWorld picked up my article from CIO.com (as we’re sister publications). The original link is http://www.cio.com/article/451622
Crap, I screwed the links up. Sorry about that.
Here is the NetworkWorld link: http://www.networkworld.com/news/2008/092908-developers-respond-to-…
Along with http://www.informationweek.com/news/windows/operatingsystems/showAr… , http://www.crn.com/software/210604526 (CRN), http://www.microsoft-watch.com/content/developer/what_does_visual_s… (Microsoft Watch), http://www.betanews.com/article/Microsoft_shares_early_videos_scree… (BetaNews), and http://arstechnica.com/journals/microsoft.ars/2008/09/30/microsoft-… .
Watch out, Microsoft!
the use of “2010” is copyrighted by the IOC!
(c.f. http://yro.slashdot.org/article.pl?sid=08/09/30/2257234&from=rss )
But it's only 2008! What will they release in 2010?
This really is a look ahead; VS2008 hasn't been out too long (Feb 08).
Although I don't program in .NET (I'm a Delphi fan myself), I've always thought that the VS IDE is one of the best around.
Jesus, “Democratizing Application Lifecycle Management” ? Give me a break with the buzzwords already…
Why do I still need VS 2005 to compile things like Google Chrome when 2008 is out?
Hopefully, VS 2010 will address this issue.
Wow, my BS-o-meter just asploded.
.NET to me has been one of the biggest scams, at least for me. When our projects were made with Delphi we could have the solution in a couple of months; with .NET it takes us from 6 to 8 months, but there is no way you can make a hyped IT manager change his mind.
This is my personal opinion; if .NET has worked for you then I'm glad, but honestly it has been overrated. After measuring the results, that's my conclusion.
I’m not a big fan of Microsoft, but I do give them credit for good development tools. Also, I have found .NET to be a very productive language for the following reasons:
1. Fully object oriented – for those who came from the VB6 world, this is quite a step up. No longer do the VB guys have to talk about their “Object-based” language. VB is now fully in the OO world. However, this can be a bummer for those VB6 folks who never really “got” the OO concept, and were just cranking out piles of spaghetti code.
2. Multi-language projects – the divide between the VC and VB aisles dramatically narrowed. Before, the pocket-protector crowd gravitated to the Visual C++ side of the house, and the folks who had actually been on dates were on the VB6 side. Now, if they were smart, the VC guys have moved on to C#, and the VB guys can now fully cooperate on the same projects. Heck, even you Delphi.NET folks are welcome!
3. Desktop vs. web – the gap between the web folks and the desktop crowd also was narrowed. Using .NET, the desktop and web paradigms are much more similar than in the past. Most desktop developers have little trouble adjusting to the ASP.NET world.
As for your team taking that much longer to do .NET projects, I imagine that is just the adjustment to new languages. I would have recommended Delphi.NET to better reuse your skills.
As for me, Go Mono!
Vim,
Bram Moolenaar to release his next-generation, auto-documenting, multi-platform text editor. Vim, the de facto split-window-capable text editor's new strengths come from refocusing its attention on the four pillars of new modern technologies:
– Time-phased sharing of Web 2.0 standards.
– Rationalising of business integration management.
– Multicapable future interface features.
– Maintaining organising enterprise integration.
This all, together with Vim’s new and powerful logic cloud based editing focus shift is sure to pave the road to your company’s future.
Vim, laying the foundation of your enterprise's embracement of the future!
You forgot Synergistic Service-oriented Search and Multi-paradigm Global Infrastructure Supply Chain Replacement.
%s/truth/buzzwords/g
1. Fully object oriented – for those who came from the VB6 world, this is quite a step up. No longer do the VB guys have to talk about their “Object-based” language. VB is now fully in the OO world. However, this can be a bummer for those VB6 folks who never really “got” the OO concept, and were just cranking out piles of spaghetti code.
I'm sure it is a breeze for VB6 developers, but we used objects in Delphi before C# even existed.
2. Multi-language projects – the divide between the VC and VB aisles dramatically narrowed. Before, the pocket-protector crowd gravitated to the Visual C++ side of the house, and the folks who had actually been on dates were on the VB6 side. Now, if they were smart, the VC guys have moved on to C#, and the VB guys can now fully cooperate on the same projects. Heck, even you Delphi.NET folks are welcome!
We are able to mix Delphi, C++, and Python already; we are not interested in VB.
3. Desktop vs. web – the gap between the web folks and the desktop crowd also was narrowed. Using .NET, the desktop and web paradigms are much more similar than in the past. Most desktop developers have little trouble adjusting to the ASP.NET world.
Delphi comes with the IntraWeb framework; all your Delphi code can be, and actually is, reused if we need to make web pages. Actually, it is an advantage not to have to depend on IIS at all.
Delphi.NET is good, but the dependency on the .NET framework is a killer.
As I said, we develop faster with the same practices in Delphi than with C#.
Good to see it works for others.