
Question

I work at a large healthcare organization as a mid-level software developer. I have over 10 years' experience in the IT industry using Microsoft technologies (ASP.NET & SQL Server).

When I go to conferences, code camps, and .NET user group meetings, I hear about all kinds of new tools and technologies: MVC, LINQ, Entity Framework, WCF web services, etc. I guess you could say I'm in my comfort zone using the same old stuff from ASP.NET 2.0. I use typed datasets for my data access layer. I use web forms and feature-rich server controls with master pages. I know how to use plain old SQL and create queries in my typed datasets to get at the data my applications need.

Throughout my career, I've always been careful not to let my skill set become obsolete. What I currently use works fine, and my development time is fast. But I'm concerned that if I were laid off, interviewers would ask how many MVC apps I've written, or how comfortable I am with LINQ or WCF web services.

I know it doesn't matter how many conferences I attend or how many books and videos I go through on some new technology... I have to actually implement and use it or it simply won't sink in. Also, hiring managers don't care how much someone has read up on something, only about real use of and experience with a technology.

I have a new project to write. I've gone to my manager and asked for additional time on the project for learning and implementing technology I may not be familiar with. Our organization encourages its employees to "learn and grow" and to continue our education, but I always get resistance when I ask for more time to ramp up on something new. My manager is asking for concrete business reasons for implementing these new technologies, and I don't have business reasons; my reason is that I don't want to become obsolete. I could say it would make the project more maintainable for other developers in the future, since at some point people could stop using these older technologies, but that's about all I can think of. Do LINQ/Entity Framework/MVC apps perform better? So much so that the customers (the users in the departments I'm building this app for) would notice? I doubt it.

I'm interested in your thoughts on this. Do many of you face similar plights when trying to adopt newer technologies? I doubt I'm on the bleeding edge of technology, either. Are there "business reasons" you would bring to light for using these technologies?

Thanks in advance! Sorry for the long wall of text.

Explanation / Answer

Not becoming obsolete sounds like a pretty good business reason for using new technology. It flows both ways: you don't want your skill set to become outdated, but your boss should also be concerned that he might not be able to find employees willing or able to work on outdated technology. If the majority of developers and employers leave an old technology for a new one, that might be reason enough for your company to do so too.

To answer your question specifically: evaluate each of those technologies individually to see whether it meets your needs. I'll give you my take, but you really need to spend at least half a day researching and evaluating each one of them as it applies to your specific business needs.

LINQ: This is something you can use even without Entity Framework. LINQ is a technology for working with collections of data, and you can use it in your applications even if you don't use it for loading that data from the database. Do yourself a favor and learn how to use lambda expressions and the LINQ extension methods. They will save you time, make your life as a developer easier, and reduce the amount of code you need to write.
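For a feel of what that buys you, here is a minimal sketch (the Patient class and the sample data are hypothetical, just for illustration):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class Patient
    {
        public string Name { get; set; }
        public int Age { get; set; }
    }

    class Program
    {
        static void Main()
        {
            var patients = new List<Patient>
            {
                new Patient { Name = "Adams", Age = 67 },
                new Patient { Name = "Baker", Age = 34 },
                new Patient { Name = "Chen",  Age = 71 }
            };

            // Filter, sort, and project in one readable chain instead of
            // hand-rolled foreach loops and temporary lists.
            var seniorNames = patients
                .Where(p => p.Age >= 65)   // lambda as a predicate
                .OrderBy(p => p.Name)      // sort by a key selector
                .Select(p => p.Name);      // project to just the names

            foreach (var name in seniorNames)
                Console.WriteLine(name);   // prints: Adams, Chen
        }
    }

The same chain works on arrays, lists, or query results, because the extension methods are defined on IEnumerable<T>; that's why LINQ pays off even before you touch your data access layer.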

Entity Framework: This seems to be the future of data access in the Microsoft world. Most of the new frameworks, technologies, and tools from Microsoft are designed to work with Entity Framework. It's not perfect, but it's a lot nicer than using datasets, especially if you use LINQ to Entities. One big business reason for using Entity Framework is that it reduces the amount of SQL you need to write, since the framework generates it for you. In my experience, most developers aren't very good at writing SQL anyway (and most companies don't have a dedicated DBA), so for most applications Entity Framework should make development faster and more efficient. Entity Framework also lets you work with POCOs, which have less overhead and are easier to work with than datasets.
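To make the contrast with typed datasets concrete, here is a rough sketch of LINQ to Entities using the code-first DbContext API (the Appointment entity and ClinicContext are made up for illustration; this assumes the EntityFramework library is referenced and a connection string is configured):

    using System;
    using System.Data.Entity;   // DbContext / DbSet (EF 4.1+)
    using System.Linq;

    // A plain POCO entity -- no generated dataset classes.
    public class Appointment
    {
        public int Id { get; set; }
        public string Department { get; set; }
        public DateTime ScheduledOn { get; set; }
    }

    // The context maps POCOs to tables; EF generates the SQL.
    public class ClinicContext : DbContext
    {
        public DbSet<Appointment> Appointments { get; set; }
    }

    class Program
    {
        static void Main()
        {
            using (var db = new ClinicContext())
            {
                var today = DateTime.Today;

                // LINQ to Entities: this query is translated into SQL
                // and runs on the database server, not in memory.
                var upcoming = db.Appointments
                    .Where(a => a.Department == "Radiology"
                             && a.ScheduledOn >= today)
                    .OrderBy(a => a.ScheduledOn)
                    .ToList();

                foreach (var a in upcoming)
                    Console.WriteLine("{0}: {1}", a.Id, a.ScheduledOn);
            }
        }
    }

Notice there is no hand-written SQL and no generated TableAdapter plumbing; the query reads like the LINQ you'd write over any in-memory collection.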

MVC: This one might be hard to justify, as most applications won't benefit much from it. Based on the latest job postings I've seen, MVC is still in the minority (although it is gaining ground fast). For most business applications MVC might be overkill, and dragging a few controls onto an .aspx page will suffice. MVC has a learning curve, and to be productive and get the most out of it you really have to understand HTTP, HTML, CSS, and JavaScript. MVC works well when you need a highly customized web application where performance is a big priority. If that is not the case, and your developers don't have much experience with it, there probably isn't a strong business case for using it.
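For a sense of the programming model, here is a minimal controller sketch (ReportsController is a hypothetical name); note that there is no page lifecycle and no server-control tree, just a method handling a request:

    using System.Web.Mvc;

    // In MVC, a URL such as /Reports/Index is routed to a controller
    // action instead of an .aspx page with a code-behind file.
    public class ReportsController : Controller
    {
        // Handles GET /Reports/Index
        public ActionResult Index()
        {
            ViewBag.Title = "Department Reports";
            // Renders the matching view under Views/Reports/.
            return View();
        }
    }

You are much closer to raw HTTP here, which is exactly why the HTML/CSS/JavaScript knowledge matters more than it does with web forms.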

WCF web services: Do you have to provide data to remote client applications? Then WCF is probably the way to go. Are you just writing a web application that will run on the same server or local network as your database? Then don't use WCF; you don't need it, and it will only complicate things with needless abstractions.
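If you do end up needing it, the heart of a WCF service is just an attributed interface plus an implementation. A minimal sketch (the service name is hypothetical, and hosting/binding configuration is omitted):

    using System.ServiceModel;

    // The contract: the operations remote clients are allowed to call.
    [ServiceContract]
    public interface IPatientLookupService
    {
        [OperationContract]
        string GetPatientName(int patientId);
    }

    // The implementation that WCF hosts behind a binding (HTTP, TCP, ...).
    public class PatientLookupService : IPatientLookupService
    {
        public string GetPatientName(int patientId)
        {
            // Real lookup logic would go here; stubbed for the sketch.
            return "Patient #" + patientId;
        }
    }

Each of those abstractions (contracts, bindings, endpoints) earns its keep when you have remote clients, and is pure overhead when you don't.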

In short, use a new technology when it makes sense, and take the time to find out when that is. Learning a new technology takes a lot of time, but evaluating one to see whether it will benefit your specific situation shouldn't. This is something the higher-ups in your company should already be doing, but if they aren't, you need to do it yourself and then take the time to educate them on what you've learned.