Codeapalooza Coverage

by Nick Saturday, September 06, 2008 10:10 AM

I drove down to Wheaton, IL for the Chicago .NET User Group's Codeapalooza.  They have a lot of sessions, and it's impossible to see all of them, so here is my coverage of the ones I chose to attend.  I'll be updating this post after every session, so feel free to check back throughout the day.

SQL Server 2008 for Developers by Sam Nasr:  Best Quote - "I drank the punch a long time ago, so I only really know Microsoft stuff."  Covered quite a lot, from platform features to T-SQL, data types, and Reporting Services.

Talked about new data auditing features, but then mentioned the four common fields that everyone adds to a table (CreatedBy, CreatedDate, ModifiedBy, ModifiedDate) without really talking about how the two work together, or whether the new data auditing features are a standardization of that classic DBA paradigm.  Also covered: automatic synchronization of data with a SQL Server Compact Edition database.  Cool!

Definitely some cool new T-SQL features, like the ability to pass an entire table into a stored procedure.  Would that really simplify writing bad code?  FileStream Object Storage looks like a good alternative to binary blobs, with better management than just storing a path as a string in a table.  Finally you can create and initialize a variable in one statement!  Plus some new operators like +=, and the ability to insert multiple rows in one statement.  Both of those are a long time coming.  Intellisense!!!  Thankfully the Intellisense improvements come with SQL Server Management Studio 2008, even if you're connecting to a 2005 server.  Very nice.  There is of course more (like grouping sets), but some of that stuff is beyond me since SQL is not exactly my forte.
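A few of those syntax features fit in one small sketch (table and column names here are made up, and this is hedged from memory rather than his slides):

```sql
DECLARE @counter INT = 10;   -- declare and initialize in one statement (new in 2008)
SET @counter += 5;           -- one of the new compound assignment operators

-- Multiple rows in a single INSERT statement
INSERT INTO dbo.Colors (ColorName)
VALUES ('Red'), ('Green'), ('Blue');
```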

HierarchyID looks very much like an XML Data Document (especially with the methods), but it doesn't have built-in support for converting to XML.  Very strange.  But for creating tree structures in your data, it looks very cool.  Nasr concentrated on using it for organizational charts, but tree structures are a very common solution to a lot of programming problems.  This simply creates first-class support within T-SQL for the common database paradigm of a parent-child relationship with a ParentID field.
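A hedged sketch of what that first-class support looks like (names are invented; the methods are the part worth noticing):

```sql
-- A tree keyed by hierarchyid instead of a self-referencing ParentID column
CREATE TABLE dbo.OrgChart
(
    Node  hierarchyid PRIMARY KEY,
    Name  varchar(50) NOT NULL
);

-- Paths like '/1/2/' encode the tree; GetLevel() and GetAncestor()
-- replace the recursive ParentID joins you'd otherwise hand-write
SELECT Name, Node.ToString() AS Path, Node.GetLevel() AS Depth
FROM dbo.OrgChart
ORDER BY Node;
```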

WPF Demystified by Dave Bost:  Best Quote - "Where are the designers?  It's hard to tell, I don't see a turtleneck or a beret."

WPF in and of itself is a huge topic, so you're not going to cover it in an hour.  So this was treated more as a showcase of what WPF can do.  He interestingly focused on developer vs. designer, and how that is handled by WPF and new technologies (no more battleship gray).  Focused, once again, on the fact that .NET 3.0 runs on the .NET 2.0 runtime (in other words... there were additions... but not changes).  Important note... .NET 3.5 SP1 requires the .NET 2.0 SP1 runtime.  You cannot run 3.5 SP1 applications on the standard 2.0 runtime (without the service pack).

Unfortunately... this talk wasn't as in-depth as I'd hoped.  It really was the same overview presentation I'd seen many times.  Here's XAML... Windows Forms isn't dead... etc.  There is Expression Design, Blend and Web (which replaces FrontPage... FrontPage is dead!).  There is also Expression Media.  Lots more tools in that family than I was aware of (with virtualization so they can run on a Mac).  Best question:  "Is this Standards Based, or Standards Compliant?"  Excellent spin!  I always ask about SVG because XAML looks so similar to SVG.  Not surprisingly, nobody at Microsoft seems to realize (or is willing to admit) that XAML was invented 10 years ago and was called SVG.

He shows some cool sample XBAP applications on the internet that demonstrate the power of XAML.  The coolest was the British Library's "Turning the Pages".  You can use it to virtually read their collection of precious old books which are normally under glass and unavailable to regular library visitors.  There is also Vertigo's Family.Show which is a XAML application to visually represent a family tree.  And of course, don't forget Scott Hanselman's Baby Smash which is great for keeping your kids out of trouble when they start pounding on your keyboard.

So how do you start?  Learn XAML first, and then find out what parts of XAML work in WPF and in Silverlight.  Microsoft's goal is to make Silverlight and WPF more similar.  It's not always possible because the browser environment restricts us in many ways.  Then he worked through one of MSDN's Virtual Labs, which you can do too.
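For what it's worth, "learn XAML first" starts very small.  A minimal hand-written window (my own sketch, not from the lab) looks like this, and the Grid/Button markup inside is the part shared with Silverlight:

```xml
<Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="Hello XAML" Width="200" Height="100">
  <Grid>
    <Button Content="Click Me" Width="100" Height="30" />
  </Grid>
</Window>
```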

Of course, Expression Blend doesn't have built-in integration with source code control.  Supposedly it's coming, but I consider that a must-have if you want designers and developers to work together well.  I also don't like how the XAML designer in Visual Studio is so pared down.  It forces developers (who don't have a designer on staff) to buy two applications.  Doesn't Microsoft realize that developers are cheap?!

ADO.NET Entity Framework with Fakher Halim:

This guy had his presentation almost too well prepared.  He might as well have been reading off a teleprompter.  Maybe not good for this type of group.  Actually, we never really got to see much code at all.  He mostly explained why we should be using something like the Entity Framework.  For God's sake, this guy is drawing a keyboard and mouse on the white board!  I wish they had brought somebody in to actually talk about the Entity Framework.  This dude is a PhD, talking at such a high level that it's useless.  A wasted hour.  I feel sorry for this guy because he's got to watch people file out one by one.

As it turns out, this guy was also sitting in the front row of the WPF talk I mentioned earlier, and he kept asking questions that were totally off topic and taking us away from what we were all there to talk about.  He kept complaining that forms weren't looking like forms any more, and that we were breaking the old-fashioned user experience paradigm.  At some point, old dogs have to learn new tricks.

ADO.NET Data Services with Jim Fiorato:

This is all new to .NET 3.5 SP1 (and used to be called Project Astoria).  It's a pattern to present data on the web in a way much simpler than SOAP.  The results of data calls in web services are definitely not human readable.  So web services are great, but they're not easy to consume unless you're in Visual Studio and possibly Java.  This fits really well with the new MVC and JSON technologies coming around.

Data Services can either return data in an ATOM format or in a JSON format for use in Javascript.  I'd never really seen JSON before, but he had an example of a data structure in JSON... and it's super readable.  This all fits very well with the newer paradigm of RESTful services coming around.  I'm going to the REST presentation later, so this will bookend very well with that.

This has not been a good afternoon for presentations.  This guy has a Mac Book, and for some reason couldn't get it to work with the projector.  So they had to bring in another laptop, and get Virtual PC installed... and well... the presentation isn't as good as it could have been.

Once he finally got everything up and running, he basically created an Entity Framework project and exposed it via a WCF service using ADO.NET Data Services.  Essentially, all this does is allow you to translate a URL into a LINQ query seamlessly.  He spent half the time just typing URLs into a web browser and showing the results.  No code for parsing the URL, or any other code, was written.  All that query string functionality comes built in, out of the box.
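The URL conventions are a handful of $-prefixed query options.  These examples are my own reconstruction against a hypothetical Northwind.svc endpoint, not his exact demo:

```
/Northwind.svc/Customers                           -> the whole entity set (ATOM feed)
/Northwind.svc/Customers('ALFKI')                  -> a single entity by key
/Northwind.svc/Customers?$filter=Country eq 'USA'  -> translated into a LINQ Where
/Northwind.svc/Customers?$orderby=Name&$top=10     -> OrderBy plus Take
```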

The idea of a query interceptor was pretty cool, in that you can perform extra logic before the query takes place and then modify the query based on that logic.  For instance, you can modify a GET to only return data based on someone's permissions.  There was an excellent question on transaction support.  For this model, there is no real support for transactions.  At that point, you are better off going to full-blown Web Services.  This is really only great for fast, read-only data access.
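A query interceptor is just an attributed method on the service class.  This is a hedged sketch of the permissions example (the entity set, context type, and property names are mine, not his):

```csharp
using System;
using System.Data.Services;
using System.Linq.Expressions;
using System.Web;

public class MyDataService : DataService<NorthwindEntities>
{
    // Runs before every GET against the Orders set; the returned predicate
    // is composed into the query, so callers only ever see their own rows.
    [QueryInterceptor("Orders")]
    public Expression<Func<Order, bool>> OnQueryOrders()
    {
        string user = HttpContext.Current.User.Identity.Name;
        return o => o.OwnerName == user;
    }
}
```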

Rest and JSON using WCF and ADO.NET Data Services with Larry Clarkin:  Best Quote - "You can explain any software concept using Star Trek.  After all, Capt. Kirk was the first blogger."  It's tied with "The new Hello World is - Let's build a blog!"

We had low expectations to begin with since he came into the room and said "Well, I haven't finished my presentation yet."  Way to represent Microsoft Larry!  And true to form, he actually started out by not talking about REST and JSON, but instead started talking about photography... specifically showing Sea Dragon.  Looked pretty cool.  Deep Zoom is actually part of this, and is built into Silverlight.  It is similar to Google Maps in that it delivers the different pieces of resolution in tiles, but it is much smoother.  It might actually use the new JPEG 2000 in order to do part of the render.  He also showed Photosynth (which I hadn't heard about).  It allows you to splice together different angles of a location, and using the EXIF data out of the picture along with spatial recognition, it creates a 3D scene.  One note that Larry mentioned is that you shouldn't use cropped photos, because the focal length is not synced with the result of the crop, and that screws up the software.  Finally he showed AutoCollage, which loads a whole slew of photos and tries to auto create a collage based on what the software thinks is interesting.

So this ended up being much more of an overview look at REST (Representational State Transfer) and JSON (Javascript Object Notation), and even some old school (at least in the .NET time frame) serialization.  Of course, I remember writing MFC serialization code in C++.  He started by showing what the XML Serializer will do for you out of the box... which has been available since .NET 1.0.  I do think he spent too much time talking about existing technologies, when the topic was supposed to be REST and JSON.  Though the attributes you can apply to classes to control XML Serialization also apply to REST serialization, since REST is still XML, though much simpler than SOAP.  Is REST better than SOAP?  REST is much simpler than SOAP, but SOAP has more features for transactions and such.
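The out-of-the-box XML Serializer behavior he showed, plus the attributes that control it, boils down to something like this (my own toy type, not his demo):

```csharp
using System;
using System.Xml.Serialization;

public class Book
{
    [XmlAttribute("isbn")]   // serialize as an attribute instead of a child element
    public string Isbn;

    [XmlElement("Title")]    // rename the element in the output
    public string Name;

    [XmlIgnore]              // leave this member out of the XML entirely
    public decimal InternalCost;
}

class Demo
{
    static void Main()
    {
        var ser = new XmlSerializer(typeof(Book));
        ser.Serialize(Console.Out, new Book { Isbn = "1234", Name = "REST Explained" });
    }
}
```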

Why JSON?  Well for one, it avoids the angle bracket tax.  However, it's also great for use with Javascript, since all parsers know how to deal with JSON.  After all, it's called Javascript Object Notation.  So there is no need to parse the data on the client side, which makes application development faster and easier.  It's also faster to run on the client in Javascript.  But don't use JSON for a thick client or for server-to-server communication.  For that, you ought to be using XML (through SOAP or REST).

Then he showed some code on how to deliver JSON through a WCF service.  Overall though, he gave a great overview presentation.
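The heart of that WCF code is one attribute on the operation contract.  A hedged sketch (the service and type names are mine, not his):

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Web;

[DataContract]
public class Post
{
    [DataMember] public string Title { get; set; }
    [DataMember] public string Body { get; set; }
}

[ServiceContract]
public interface IBlogService
{
    // With webHttpBinding, this tells WCF to emit JSON instead of a SOAP envelope
    [OperationContract]
    [WebGet(UriTemplate = "posts/{id}", ResponseFormat = WebMessageFormat.Json)]
    Post GetPost(string id);
}
```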

How Much JavaScript is Too Much?

by Nick Wednesday, July 09, 2008 9:30 AM

One of the things I try to do on my blogs (especially my other blog) is to get as much of the processing done server side, with as few outside dependencies as possible.  There are a few reasons for this.  First, I can better control any problems that may occur because the source is more easily known.  Second, it allows for more server side caching.  Third, it is a pendulum reaction to when I used to host on Blogger and everything extra I wanted had to be provided through a third party via JavaScript.  My old Blogger blogs got to be really slow as I tried to add more and more features.

I like to think that I keep a nice balance of server side and client side processing for all my sites.  One of the political blogs that I keep tabs on is Wigderson Library & Pub.  James is a big fan of advertising, and JavaScript.  Luckily I only read his site through his RSS feed... because if I actually had to go to his site to check his content, it would drive me nuts.  For fun, I decided to "battle" Jim using Webslug, which measures comparative load time performance between two sites.  Here are the results:

(Webslug screenshot: the load-time comparison of my site vs. Wigderson Library & Pub)

You're reading that right... 192 seconds, or over 3 minutes for the page to finish loading.  Now then... it's not as bad as it seems, since the positioning of the JavaScript allows the page to render the majority of its content before that.  But still... at what point do you declare that you have too much JavaScript?  And for folks like Jim, who still use third party applications like Blogger for all his content, what are the alternatives to speed load time?

LINQ and Stored Procedures Not Always Magic

by Nick Monday, June 09, 2008 3:45 PM

I had an interesting experience while trying to import a stored procedure into a LINQ to SQL Classes Designer surface today.  Normally this is a pretty straightforward process.  First you open Server Explorer, then go to one of your connections, find the stored procedure, and then drag and drop it onto the designer surface.  Boom, it's suddenly available from your DataContext as a method. 

That's exactly what I did today, except when I created a var for the stored procedure result set, and then added a foreach to loop through the rows in the result set... nothing showed in my Intellisense.  Huh?  A little hover magic and I saw that my generated method was returning an int.

After some investigation, I found that the designer has a hard time handling stored procedures that use temporary tables, as they throw off the procedure metadata.  As it turns out, there are two possible solutions.  First, you can use a table variable instead:

DECLARE @tempTable TABLE ( ... )
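For example (with hypothetical columns), the two declarations have the same shape; only the temp table confuses the designer's metadata sniffing:

```sql
-- Temporary table: the LINQ to SQL designer falls back to an int return type
CREATE TABLE #results ( Id INT, Name VARCHAR(50) );

-- Table variable: same shape, but the designer infers the result set correctly
DECLARE @results TABLE ( Id INT, Name VARCHAR(50) );
```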

Your second option is to continue to use a temporary table, but hand modify your dbml file using an XML editor.  Simply right click on the file in Solution Explorer and choose Open With and then choose your favorite XML editor.  Mine is Notepad2.  Search through the file for your stored procedure, which for a procedure named "storedProcedureName" might look like this:

<Function Name="dbo.storedProcedureName" Method="storedProcedureName">
  <Parameter Name="Parameter1" Parameter="Parameter1" Type="System.Int32" DbType="Int" />
  <Return Type="System.Int32" />
</Function>

Then remove the <Return ... /> element and replace it with an <ElementType> node which may look like this:

<Function Name="dbo.storedProcedureName" Method="storedProcedureName">
  <Parameter Name="Parameter1" Parameter="Parameter1" Type="System.Int32" DbType="Int" />
  <ElementType Name="storedProcedureNameResult">
    <Column Name="Result1" Type="System.Int32" DbType="int NOT NULL" CanBeNull="false" />
    <Column Name="Result2" Type="System.String" DbType="varchar(10) NOT NULL" CanBeNull="false" />
    <!-- ... -->
  </ElementType>
</Function>

This second method is the one I used, since my temporary table will hold thousands of rows, which is too inefficient for a table variable.

This Looks Super Cool

by Nick Friday, April 11, 2008 9:10 AM

Scott Hanselman has a great overview post up on ASP.NET Dynamic Data.  It's a new preview ASP.NET framework for .NET 3.5 that allows you to mark up your business objects with metadata, which your GUI code then uses to control what type of control is used to view and validate your data.  This is huge if you have multiple screens that show the same data points in multiple ways, because it allows you to centralize this code in your business objects, change your visualization in one location, and have it spread across all your pages.  Wow!
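From what I can tell, the markup is attribute-based.  A hedged sketch using the new DataAnnotations attributes (the property names are invented):

```csharp
using System.ComponentModel.DataAnnotations;

public class Product
{
    [Required]          // the scaffolded UI renders a required field...
    [StringLength(50)]  // ...and enforces the length during validation
    public string Name { get; set; }

    [Range(0, 10000)]   // numeric editor with range validation
    public decimal Price { get; set; }
}
```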

Now the only question I have is why doesn't something like this exist for WinForms or XAML?  Or does it already exist and I just don't know it?

Real World

by Nick Saturday, May 12, 2007 10:56 AM

This is interesting.  I'm currently reading a Self-Paced Training Kit book for an exam towards my MCPD.  It's talking about different features in .NET 2.0 (which seems kind of boring since I really already know all of this).  However, what is good so far about this book are the occasional interludes of "Real World" information thrown in by the authors that break from the normal "Oo Ra Ra, Go Microsoft" information.  One of them is talking about Generics (a feature that I love) which shocked me:

I haven't been able to reproduce the performance benefits of generics; however, according to Microsoft, generics are faster than using casting.  In practice, casting proves to be several times faster than using a generic.  However, you probably won't notice performance differences in your applications.  (My tests over 100,000 iterations took only a few seconds.)  So you should still use generics because they are type-safe.

Huh?!  Granted, the type safety aspect is a huge benefit.  However, I can't tell you how many times I've heard from Microsoft people about the huge performance benefits.

To be honest, I always suspected a problem in that respect, but I always thought it was me.  I had written a .NET 1.1 library a while ago that implemented different types of Binary Search Trees and a Skip List.  When .NET 2.0 came out, I decided to make a 2.0 version using Generics as an exercise to learn the new feature.  When I did some timing comparisons, I found that it was either a little slower, or largely the same depending on the test.  I always figured it was something in my implementation, or my timing code.  Maybe it wasn't me after all.
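A minimal version of the kind of timing comparison I ran (a fresh sketch, not my original library code).  Note that this one measures value types, where boxing dominates and generics really do win; with reference types the gap mostly disappears, which may be what the book's author measured:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;
using System.Diagnostics;

class TimingDemo
{
    static void Main()
    {
        const int N = 1000000;

        var sw = Stopwatch.StartNew();
        ArrayList untyped = new ArrayList();
        for (int i = 0; i < N; i++) untyped.Add(i);   // boxes each int
        long sum1 = 0;
        foreach (object o in untyped) sum1 += (int)o; // unbox via cast
        Console.WriteLine("ArrayList: " + sw.ElapsedMilliseconds + " ms");

        sw = Stopwatch.StartNew();
        List<int> typed = new List<int>();
        for (int i = 0; i < N; i++) typed.Add(i);     // no boxing
        long sum2 = 0;
        foreach (int i in typed) sum2 += i;
        Console.WriteLine("List<int>: " + sw.ElapsedMilliseconds + " ms");
    }
}
```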

Fun With Reflection... Reference vs Value Types

by Nick Monday, May 07, 2007 9:20 PM

So here is an interesting little quirk I found regarding Reflection in .NET.  I was writing a serialization library that was capable of reading and writing to a CSV file format, and also to a fixed width file format.  The project I was working on had various CSV and Fixed Width formats to deal with, so we wanted a nice and generic library to read and write with.  Moreover, the code I was replacing basically just parsed everything into a string, and then there would be tons of logic that simply indexed into a string using a constant to represent the position in the row.  We wanted each record in the file to be read into a strongly typed structure.

I decided to make use of Reflection so that you could create a data structure that looked like this:

[TextSerializable]
public class Person
{
    [TextField(0)]
    public string Name;

    [TextField(1)]
    public int Age;

    [TextField(2)]
    public DateTime DateOfBirth;
}

And then easily read it in by doing this:

TextReader reader = new StreamReader( "TestFile.csv" );
CSVSerializer<Person> ser = new CSVSerializer<Person>();
Person p = ser.Deserialize( reader.ReadLine() );

Sounds pretty easy, right?  But what if you want your target data type to be a struct instead of a class?  Should be just as easy, right?  As it turns out, there is a little-known quirk in how you use reflection to set Property and Field values that makes a big difference due to boxing.

The serialization class that I wrote uses reflection to find all the Fields and Properties that have been marked with the TextField attribute.  Then during the deserialization process, it uses the PropertyInfo.SetValue (or FieldInfo.SetValue) method to set the value on the newly created target type.

Here is the trick.  You have to know whether the target type is a class or a struct.  If it's a class, then you can pass in an object reference.  If it's a struct, then you have to store the variable in a ValueType variable.  Otherwise the structure will be boxed, and during the boxing/unboxing process, the value will be lost!

It's weird... you call the exact same SetValue method... there is not even a special overload that takes a ValueType vs Object type.  However, it makes all the difference in the world.  Here is part of the code from the Deserialize method.  TargetType is the generic type that gets passed in during the declaration (in the above example it was Person).  I stored the Type variable in _type.

TargetType returnObj = new TargetType();

ValueType returnStruct = null;
if ( _type.IsValueType )
{
    object tempObj = returnObj;
    returnStruct = (ValueType)tempObj;
}
// Parsing code here
if ( _type.IsValueType )
    AssignToStruct( returnStruct, fieldObj, attr.Member );
else
    AssignToClass( returnObj, fieldObj, attr.Member );

And here is AssignToClass and AssignToStruct:

private void AssignToClass( object obj, object val, MemberInfo member )
{
    if ( member is PropertyInfo )
        ( (PropertyInfo)member ).SetValue( obj, val, null );
    else if ( member is FieldInfo )
        ( (FieldInfo)member ).SetValue( obj, val );
    else
        throw new TextSerializationException( "Invalid MemberInfo type encountered" );
}

private void AssignToStruct( ValueType obj, object val, MemberInfo member )
{
    if ( member is PropertyInfo )
        ( (PropertyInfo)member ).SetValue( obj, val, null );
    else if ( member is FieldInfo )
        ( (FieldInfo)member ).SetValue( obj, val );
    else
        throw new TextSerializationException( "Invalid MemberInfo type encountered" );
}

Notice how they are identical, except for the type being passed in?  It's absolutely crazy-making to have this copy-and-paste code, but it's necessary.  The other crazy-making part is that FieldInfo and PropertyInfo don't have a common base class that includes SetValue.  For whatever reason, all .NET languages treat Properties and Fields as syntactically identical, but they are completely different when reflected upon.  More copy-and-paste madness.

Never Stop Learning

by Nick Tuesday, May 01, 2007 12:18 PM

One of the things they told us at Engineering School was that they weren't just teaching us "stuff", they were teaching us "how to learn".  You can never stop learning.  New technologies, languages, frameworks, methodologies are always coming around.  Either you keep up, or you fall behind.

I like to think that I've done a good job of keeping up.  My resume is filled with a vast array of TLAs showing that I don't just give lip service to learning; I've done it, and applied it to projects.  I've worked on C++ with MFC, then COM and ATL, and now .NET.  Now it looks as if something big is coming around the corner.

Time to start learning again.

More Things Visual Basic Left Out

by Nick Thursday, April 12, 2007 9:32 PM

I mentioned last week how Nullable Types aren't really in Visual Basic.  You know what's also missing?  Iterators.  No Yield statement.  Since I often have to switch between C# and Visual Basic, I find these differences quite frustrating.  I really wish Microsoft would do a better job of maintaining feature parity between the various .NET languages they support.
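For reference, here is exactly what VB is missing (C#):

```csharp
using System;
using System.Collections.Generic;

class IteratorDemo
{
    // The C# compiler builds the enumerator state machine for you;
    // in VB you'd have to hand-write an IEnumerator implementation.
    static IEnumerable<int> Evens(int max)
    {
        for (int i = 0; i <= max; i += 2)
            yield return i;
    }

    static void Main()
    {
        foreach (int i in Evens(6))
            Console.Write(i + " ");   // 0 2 4 6
    }
}
```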

Nullable Types Not Quite There

by Nick Monday, April 02, 2007 8:59 PM

Lately I've finally been working with Visual Studio 2005 and .NET 2.0.  Previous to this, I'd been working with 2003 and 1.1 almost entirely, and had only read about (though extensively) and played with 2.0.  Since I mostly use C#, I got used to all the new features it added, and wrongly assumed that Visual Basic .NET brought the same features to the table.

As I should know by now... that was a poor assumption.  I found this out today when I was trying to use Nullable types to deal with database access.  I've been working with VB most recently, so I created an empty playground project to play with the features and see how they worked.  Then I realized I had no idea how to declare a Nullable type in Visual Basic.  I'd only done it in C#.  That's no problem... a minute later I had written this sample code:

Dim n As Nullable(Of Integer)
Dim m As Nullable(Of Integer)

n = 7
m = 3

Console.WriteLine(n + m)

Hmmm... Nullable(Of Integer) isn't nearly as nice looking as int?, but VB syntax has always seemed bulkier to me than C#.  Hold on a second!  Why doesn't that last line compile?  Doesn't VB.NET have the same type coercion features as C# for nullable types?  That would be a definite no.
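The workaround in this version of VB is to do the lifting yourself (my own sketch):

```vb
If n.HasValue AndAlso m.HasValue Then
    Console.WriteLine(n.Value + m.Value)
Else
    Console.WriteLine("Nothing")
End If
```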

For the record, here is the identical code in C#:

int? n = 7;
int? m = 3;

Console.WriteLine( n + m );

Now then... doesn't that look nice?  And it compiles and works like you'd expect it to.  Is that so much to ask?  So the reality is that Visual Basic only supports Nullable types because they happen to be implemented using Generics in the CLR... Visual Basic didn't do any extra work to support them.  But that's all you get.  Just bare bones support without any of the extra niceties that you'd expect to find.

If You're Lucky It Will Blow Up

by Nick Tuesday, March 27, 2007 8:49 PM

In my tenure as a software engineer, I've seen a lot of poorly written code, especially in my C++ days.  Pointer magic that had no business being compiled, and error handling that was slim to non-existent.  Even in the world of .NET, it's still possible to write dangerous code:

try
{
   // Call a method that may throw an exception
}
catch ( Exception ex )
{
   // Eat the exception
   System.Diagnostics.Debug.WriteLine( ex.Message );
}

How many times have you seen code like that?  Hell, sometimes I don't even see the Debug.WriteLine.  Sometimes they just eat the exception.  Exceptions aren't a bad thing.  In fact, having an exception occur is a good thing.  I still remember something my advisor said during a class talking about pointers at MSOE... "If you're lucky, it will blow up."

The worst errors that can occur are the ones that don't have outward signs.  Perhaps you did some bad pointer arithmetic, but instead of throwing an exception because you illegally accessed memory, it simply reads that memory and a counter equals 50 instead of 5.  Your program could appear to function perfectly, but will provide incorrect results.  Worse yet, when a symptom does finally appear... the symptom will be so far away from the cause that you may never find it.

Eating exceptions is about the closest you can come in .NET without trying really hard.  People are so afraid of seeing a box with that red X that they eat exceptions left and right, without realizing that an immediate failure that is easy to identify and fix is preferable to a bug that is impossible to track down.

Exceptions happen for a reason... and that reason should be taken seriously.  If you can write code to handle the exception in a catch block, then by all means do it.  But if your "exception handling" code simply boils down to logging it and continuing... do yourself a favor and rethrow that exception (or don't catch it at all) and put your program out of its misery.  You'll thank me in the long run.
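Concretely, the pattern I'm advocating looks like this (DoWork and Logger are stand-ins for whatever you actually call):

```csharp
try
{
    DoWork();
}
catch ( Exception ex )
{
    Logger.Write( ex );   // record it...
    throw;                // ...then rethrow; "throw;" preserves the original
                          // stack trace, while "throw ex;" would reset it
}
```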

About Me

Nick Schweitzer
Wauwatosa, WI

Contact Me
I'm a Software Consultant in the Milwaukee area. Among various geeky pursuits, I'm also an amateur triathlete, and enjoy rock climbing. I also like to think I'm a political pundit. ... Full Bio


Standard Disclaimers

Disclaimer
The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way.

© Copyright 2017 Nick Schweitzer