I just wanted to give everyone a quick heads up about the IADNUG meeting tomorrow night at the DMACC West campus. Nick Parker will be doing a presentation on dependency inversion in .NET. I hope to see you there.
This week I hit the 250 mile running mark on my Nike + iPod and I love it. I definitely hate running, though; it’s rather boring and tends to be a lot of work, but adding a bit of technology that lets me monitor my progress makes it bearable and keeps me in better shape. The Nike + iPod consists of a wireless sensor that attaches to your shoe, a wireless receiver that plugs into your iPod, and the nikeplus.com website. The sensor is designed to slip under the sole of certain Nike shoes, which now appear to be down to as cheap as $40, but when I got the device the cheapest shoes that supported it were $100, so I opted for the RunAway, which cost me about $10. I read a few comments on the Internet saying these adapters don’t work because the sensor works by pressure, which you don’t get by strapping it to your laces, but I haven’t had any problems. Now that the Nike shoes have come down in price, though, I may get a pair and test out the difference. The sensor and receiver combo runs around $25.
Basically the entire add-on is a glorified pedometer, but it has some great features for monitoring your running. First off, you can check on your run while you’re running: with one push of the center iPod button, you get your distance, time, and pace. You can also track your overall running by syncing your runs up to the Nike+ website and seeing how well you’re doing over time. You can even set target goals on the website and track whether you are meeting them.
Another nice feature is the ability to set up a power song. Mine is Eye of the Tiger, and holding down the center iPod button for a few seconds plays it whenever I need a little extra motivation.
I love this add-on, and the ability to track my running motivates me to run more. The one thing I don’t like is that I can’t change the song very easily while running, since the iPod is strapped to my arm and I can’t see the screen. Hopefully this problem will be resolved if the Nike Amp+ ever comes out. It is a wristwatch-like remote that lets you control the iPod while running.
Check it out and I’ll let you know when I hit 500 miles.
This weekend I have been spending most of my time rebuilding my KnoppMyth box by installing the latest version, applying the patch to handle the new TV listings provider, Schedule Direct, and adding a new 500GB drive for recording TV. I wrote down most of the steps to get things set up and plan to do a blog post about it soon. For the moment, what I did want to post is a link to a page on how to add a second drive to Linux. The instructions are clear and worked nicely. I'm mounting this new hard drive to the /myth/tv directory so that all TV recordings will go to it.
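For reference, the part of that setup that makes the mount survive a reboot is the /etc/fstab entry. A minimal sketch, assuming the new drive shows up as /dev/hdb with a single ext3 partition (your device name and filesystem may differ, so check with the linked instructions first):

```
# /etc/fstab - mount the new recording drive at /myth/tv
# device     mount point   type   options    dump  pass
/dev/hdb1    /myth/tv      ext3   defaults   0     2
```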
If you're trying to do exactly what I did, then you also need to set the proper permissions back on the /myth/tv directory or you will get a bunch of errors. First, set the owner of the directory back to mythtv:
chown mythtv /myth/tv
Then, to get the permissions exactly back to where they were, I added write permission back on for the group by doing:
chmod g+w /myth/tv
You can compare the before and after permissions and ownership by doing ls -l /myth before you make any of these changes and then again afterwards.
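If you want to see what those commands actually change before touching the real directory, you can rehearse the chmod half in a throwaway directory (the chown step is skipped here since changing owners requires root, and the directory name is just for illustration):

```shell
# Rehearse the permission change in a scratch directory
mkdir -p /tmp/myth-demo/tv

# Start from the typical default: rwxr-xr-x (755)
chmod 755 /tmp/myth-demo/tv
before=$(ls -ld /tmp/myth-demo/tv | awk '{print $1}')

# Add group write, exactly as done on the real /myth/tv
chmod g+w /tmp/myth-demo/tv
after=$(ls -ld /tmp/myth-demo/tv | awk '{print $1}')

# The group triplet gains a 'w': rwxr-xr-x becomes rwxrwxr-x
echo "before: $before"
echo "after:  $after"
```

The same before/after comparison with ls -l is what you would do on /myth itself.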
I said goodbye to Wells Fargo this week and hello to a new job that will hopefully fit my style much better. Wells Fargo just wasn’t hitting the mark for what I wanted to be doing professionally, and I found myself doing a lot of content changes and minor code changes, many of which were in classic ASP and .NET 1.1. I was about to start a new project from scratch in .NET 2.0 that would have taken the next two or three months to complete, but I realized that once that was finished I would probably be back to handling minor changes.
I’ve worked at two different large companies in my development career, and I have come to the conclusion that they do not respect the individual developer as an asset. Don’t get me wrong, in both cases my direct manager was very good and I felt they considered me an asset, but in general I think management as a whole in a large company doesn’t care which developer is sitting in the seat and thinks all developers have the same skill set. From what I have seen this is definitely not the case; aside from a couple of very smart developers I have met, a significant number of developers really don’t seem to have the aptitude or the ambition to do their jobs.
The biggest thing I think corporations with development departments need to start doing is charging their internal departments. A significant amount of money and time gets wasted when users and management abuse their development department by flip-flopping on changes and being generally indecisive. By billing the department that the application is for (at a reasonable rate), the requesting department would be more conscious of the development time it is using. It’s very easy to take advantage of the development department when there are no consequences for it.
So where are you off to, you ask? Well, I’ve headed over to work with Nick at Two Rivers Marketing. It will be nice to finally leave my khakis hanging in the closet and wear some comfortable clothes to work for a change. I’ve met several of the guys there through the IADNUG and felt they had the type of attitude that I was looking for. I hope that I can learn a lot there and maybe even teach them a few things.
Lastly, I would just like to point out that most of us developers work at least forty hours a week, probably more, so if you’re not doing something you enjoy then you owe it to yourself to do something about it. If you’ve been telling yourself for a while now that things will get better soon, or that a fun project is probably around the corner, then you need to start considering a job change, or at least stick your resume out there to see what the world has to offer you.
Take a look at the decTop. It's a small computer weighing three pounds that can run Windows CE or Linux. It has 128MB of memory and 10GB of storage. With four USB ports you could easily add a wireless USB stick for network connectivity and do some cool things. I came across this on LifeHacker and my mind is full of ideas for it. Plus, at $100 it's not too bad of a price.
Well, I thought I would be nice and put up the code sample for implementing FizzBuzz using an extension method in VB.NET. To read my full explanation of FizzBuzz and extension methods, please read my previous post.
Those of you that know me know that I couldn’t care less whether you program in VB or C#, but I definitely prefer C# because it allows me to type a lot less. Extension methods are no exception: they are a bit more complicated in VB, requiring that they be placed in a module and decorated with an attribute.
Module ExtensionMethods

    Sub Main()
        For i As Integer = 1 To 100
            Console.WriteLine(i.FizzBuzz)
        Next
        Console.ReadLine()
    End Sub

    <System.Runtime.CompilerServices.ExtensionAttribute()> _
    Public Function FizzBuzz(ByVal Value As Integer) As String
        Dim rtnVal As String = ""
        If Value Mod 3 = 0 Then
            rtnVal += "Fizz"
        End If
        If Value Mod 5 = 0 Then
            rtnVal += "Buzz"
        End If
        If rtnVal = "" Then
            rtnVal = Value.ToString
        End If
        Return rtnVal
    End Function

End Module
For those of you that don’t know what FizzBuzz is, it became quite popular a while back when Jeff Atwood posted to his blog a few quotes from people about interviewing candidates for programming jobs and the fact that many of them can’t code. (It’s possible that the topic originated with someone else; I heard about it originally from Scott Hanselman’s blog, which referenced Jeff.) FizzBuzz is a simple coding exercise where you write a loop that prints the numbers from 1 to 100, except if the number is divisible by three it outputs Fizz and if the number is divisible by five it outputs Buzz. If it is divisible by both, you output FizzBuzz.
What is an extension method? An extension method is a new feature in .NET 3.5 that allows you to add methods to an existing type. It lets you extend a type without creating your own version of it through inheritance. Why is this useful? I’m sure there are many reasons. One is that if you don’t have the ability to change the type that is being passed to your class, you can’t just use inheritance and create your own version of it, but what you can do is create an extension method.
The example I am going to walk you through adds a method to int called FizzBuzz. Calling this method returns a string containing either the value of your int, Fizz, Buzz, or FizzBuzz, depending on the criteria stated above.
The code for the extension method is fairly simple:
namespace ExtensionMethodExample
{
    public static class CustomExtensionMethods
    {
        public static string FizzBuzz(this int value)
        {
            string rtnVal = "";
            if (value % 3 == 0)
            {
                rtnVal += "Fizz";
            }
            if (value % 5 == 0)
            {
                rtnVal += "Buzz";
            }
            if (rtnVal == "")
            {
                rtnVal = value.ToString();
            }
            return rtnVal;
        }
    }
}
Here I have a static class called CustomExtensionMethods. Next, I have created a static method called FizzBuzz. The magic happens when I add the this keyword to the input parameter. I then calculate FizzBuzz on my input parameter and return the appropriate string. (I’m sure there is a better way to implement FizzBuzz.)
That’s it. I then use the new FizzBuzz method like this (in my case I have created a simple console application):
using System;
using System.Collections.Generic;
using System.Text;
using ExtensionMethodExample;

namespace ExtenderMethodExample
{
    class Program
    {
        static void Main(string[] args)
        {
            // Loop from 1 through 100, inclusive
            for (int i = 1; i <= 100; i++)
            {
                Console.WriteLine(i.FizzBuzz());
            }
            Console.ReadLine();
        }
    }
}
It’s that simple. Drop this into a VS2008 console application and try it out. You will have complete access to your extension methods via IntelliSense. Happy coding.
Was this post helpful? Post a comment and let me know.
Well, Greg Brill hasn’t responded to the email with my resume and I haven’t gotten any swag from DotNetRocks!, but you may have noticed that on episode #260, at five minutes and twenty seconds, Richard Campbell read my email. I was a little surprised myself, as I was only half listening while I was at work, but sure enough he read my comments on show #244 with Scott Stanfield. He didn’t read everything, so here is my email in full:
Hey Guys,
I just wanted to let you know that the Scott Stanfield show was great. I am a .Net developer and have been listening to the big three for some time now (.NetRocks, Hanselminutes, and Runas Radio). My only complaint is that I need more content to listen to. I’ve tried a few other podcasts, but either the content or the sound quality sucked and I removed them from my subscriptions. After listening to the Scott Stanfield show I started thinking that you need to get this guy his own podcast. He knew a wide range of topics and obviously enjoyed talking about them. The way he presented information reminded me of Scott Hanselman as they both come across as very excited and enthusiastic about technology. What do you think? If you’re too busy to get it up and going then maybe you could hire a hot intern (or at least you could use it as an excuse to get one.) Just thought I would throw that out there. Keep up the good work.
Anyways, I’m going to officially start the campaign for the Scott Stanfield show. I don’t know if Scott wants to do one or not, but if I can get enough people to ask then maybe he will. My job is getting fairly boring and I need some interesting content to listen to, so until I get a new job, some swag as a bribe to shut me up, or the Scott Stanfield show starts, I’ll keep bugging for the new podcast. I’ve put together a survey to gather everyone’s opinion about the idea, so fill out the survey and feel free to comment about it here.
If you haven’t been watching Slashdot or any of the many MythTV forums out there, then you don’t know that we will all soon be losing our free XML feed of TV listings. Zap2it Labs originally provided the feed for free for noncommercial use, but they are going to stop providing it starting Sept 1st. Several individuals organized, worked with Zap2it to keep the feed going, and created a new entity called Schedule Direct. Recently Schedule Direct reached an agreement with Zap2it to continue providing the feed to open source DVR users. Of course, the rub is that it is going to cost $5 a month to subscribe. According to the website they hope to bring this down to only $20 a year, but they are waiting to see how many individuals sign up.
I don’t want to sound like a complete cheapo, but I can get a DVR from my cable company for $5 more a month, and the entire point of me originally building my own DVR was so that I would not have to pay a monthly fee and could instead use that money to keep adding to my DVR. Also, I have no idea how many MythTV and XMLTV users are out there, but someone is bound to make a decent profit off of this at $5 a month per person.
A few of the guys at work, including myself, think this is an area that Google needs to get into. Google’s all about providing different types of data, so why can’t they start providing TV data? It seems like exactly the kind of service they would provide for free. You could probably wrap some ads around it all and generate some very decent revenue. I even considered doing it myself, but I already have two projects going on at the moment along with a full time job, and I didn’t think I could put together a solution quickly enough for it to be beneficial.
Anyways, this leaves me with deciding what to do next, and I am strongly considering switching my setup over to Windows Vista Home Premium and using the Media Center functionality. From what I have heard it is pretty good; my only concern is whether my hardware is all supported, which I can’t figure out without hunting down the details on everything, or just trying it out and killing my MythTV install. Of course, after Sept 1st my DVR is just going to be a big paperweight anyway. What do you think? Should I stop being a cheapo and pony up, or stick to my guns and keep a completely fee-free DVR?
As I mentioned in a previous post, a couple former coworkers and I are putting together an application that we are hoping to take commercial. Things are going fairly well at this point, but perhaps a bit behind schedule, as we are doing it entirely in .NET 3.0, so there has been a small learning curve, not to mention the fact that we all still have day jobs. Of course, unlike me, my partners also have significant others to dedicate time to.
Anyways, one of the things I wanted to utilize for this project, because of the team environment and the disjointed schedules, was a build server. (Okay, I also wanted to learn about the entire concept and functionality of a build server.) I used CIFactory, created by Jay Flowers, for our build server. CIFactory is basically a set of scripts that configures and installs the components of the build server so that you don’t have to put together all the integration scripts yourself. The package works quite nicely and sets up things like CruiseControl.Net, Simian, NCover, NDepend, and MSTest. I have it integrated with our Subversion repository, but it can also integrate with Visual SourceSafe and other repositories, along with a lot of other tools besides what I have listed here.
At this point we are mainly using the setup to verify that the project is building properly, so that someone doesn’t grab the latest code and get stuck with errors that someone else has checked in. If a build fails, an email goes out notifying everyone of the broken build so that no one grabs updates and the person who broke it can get it fixed. (Getting people to respect the build server is a completely different complication and topic.) I want to get more familiar with the other tools, especially the ones that revolve around testing, but I just haven’t had the time. I need to put together a list of things to learn to make myself a better developer over the next six months and add testing tools to the top of the list (along with Ruby).
If you want to get going on CIFactory, there is an excellent DNRTV episode on it that I followed to get our setup going. I had it all set up in a matter of a few minutes and I really like it. Jay is also excellent at responding to any questions quickly and promptly. There is one negative I came across this past week, though, that was a real disappointment for me: CIFactory does not support multiple projects. It creates a separate CruiseControl.Net server for each project, and they won’t both run at the same time. It appears from the news group that someone has made some modifications and gotten two projects running on one CCNET server, but the benefit of CIFactory is the fact that it pretty much just works. If I’m going to have to dig into scripts to get multiple projects working, then I will probably eventually rebuild the build server without CIFactory.
CIFactory is an excellent starter package for getting up and going with a build server, and if Jay gets the time to modify it to handle multiple projects easily then it will be an excellent solution for a permanent build server. My build server path will depend on whether Jay gets multiple projects implemented before I get the free time to rebuild the server.
I use a Hotmail account. I got it a really long time ago and now I'm stuck with it. If it had a feature to forward email I would have switched a long time ago and had it forward to something else, but it doesn't, so I'm trapped. The new Windows Live Mail is a very good interface, but I prefer not to use a website to check my email when I'm at home, so instead I use Outlook 2007, which by the way seems to be extremely slow even after I installed a patch that was supposed to speed things up.
Anyways, ever since I switched from Outlook 2003 to 2007, I continuously get a prompt for my Hotmail user ID and password, even when I click the remember password button. It was a huge annoyance, and it finally got to me enough today to do some Googling. Here is the patch that appears to resolve the issue.
Things were even worse this past week when I couldn't even log into my hotmail account with Outlook, for some reason, and I was forced to use the web interface, but it appears that issue is finally resolved. Anyone know what was going on? I'm sure eventually I will be annoyed enough to switch my email address, but until then at least I don’t get prompted for my password anymore. (Knock on wood.)
Well, I started Twittering last night. (Would that be the proper wording for it?) Twitter is a site that allows you to set up a page and post short messages to it, similar to a blog, but the messages have to be under 140 characters. You can post from the website, instant messenger, or a text message from your cell phone. It's basically intended to keep friends up to date on where you are or what you are doing. I'm using it to vent throughout the day about work and life, hence the reason I haven't posted the link to my Twitter account: I just may be venting a short message about you. But feel free to try and find it. In general Twitter seems like a waste of space and time, but I also see where it could come in handy. It would be interesting to try Twittering a diet or spending habits. It would also be nice to have all your friends on Twitter so you know when individuals are as bored as you or up late at night, and who you can bug.
On Wednesday night I learned a pretty cool trick at the .NET user group meeting while Jon von Gillern was giving his presentation on RegEx: in some applications, like Visual Studio, if you hold down the ALT key while selecting text, you can select a vertical column.
This is one of those things that you would probably never know about unless someone told you, so now you know.
If you haven't been keeping an eye on Google Labs, then you have missed the new entry, Google 411. Simply call 1-800-GOOG-411 and get business directory listings for any city and state. I have tried it several times and it works fairly well. It's free and doesn't even have any ads, so it's hard to complain about it. It's all based on voice prompts and will even dial the number for you. (This means be careful if you're goofing around and looking up things like the White House.) Currently it just has business listings, but I hope they will soon add listings for people. As far as I am aware this is Google's first step out of the computer world and into a separate type of market; at least as far as I know, everything else at Google you access with a computer. The arena makes complete sense for Google, though, since it is all about data and they have lots of it. Where I would really like to see Google go next is TV listings. It's all just data as well, and with Zap2it not providing a free feed after Sept 1st, it leaves a hole in the US market to be filled. Give Google 411 a try and let me know what you think.
If you've been watching closely today, you saw that Google acquired GrandCentral.com. This site allows you to get a phone number and then have that number routed to any number of other phones you want, such as cell, work, or home, depending on rules you define. It has a host of other features as well, such as recording the conversation and sending individuals you're trying to dodge directly to voicemail. I came across GrandCentral a few months ago and have been tempted to try it out, but I haven't had a chance to. If you're using it, let me know what you think about it.
I decided to repost one of my original blog posts on how to easily build and test your connection strings that was quite popular and somehow lost during the transition to the new site. This method is fairly simple and I use it all the time when I need to build or test out a new connection string. The first thing you need to do is create a new empty file and call it Test.udl. Normally I use Notepad to do this, but you can use anything you want.
Once you have done this your file will display the computer icon associated with .udl files and if you double click on the file it will open up the Data Link Properties window.
Next, you need to fill out your connection information including provider, server, database, and login information. In order to properly get your connection string you will need to check the allow saving password box. At this point you can click the test connection button to verify that the connection is successful. If it is not, you can continue to make adjustments to the Data Link Properties window until it is.
Finally, click OK to save the file, and open up the .udl file using Notepad. If you did check the allow saving password box, you will be cautioned about the security risk of saving the password. Once you’ve opened the file you will see something similar to the following:
[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLOLEDB.1;Password=yourpassword;Persist Security Info=True;User ID=sa;Initial Catalog=TimeTracker;Data Source=WDMPDC01
You have now created your connection string and can copy and paste this into your application. This is also an excellent way of testing connectivity issues from individual computers.
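Since the string is just a semicolon-delimited list of key=value pairs, it's also easy to pull individual settings out of it if you ever need to script against one. A quick sketch using the example string above (the values are obviously placeholders):

```shell
# The connection string produced by the .udl file (placeholder values)
conn='Provider=SQLOLEDB.1;Password=yourpassword;Persist Security Info=True;User ID=sa;Initial Catalog=TimeTracker;Data Source=WDMPDC01'

# Split on ';' to see each key=value pair on its own line
echo "$conn" | tr ';' '\n'

# Pull out a single value, e.g. the database name (Initial Catalog)
db=$(echo "$conn" | tr ';' '\n' | grep '^Initial Catalog=' | cut -d'=' -f2)
echo "Database: $db"
```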
Did you find this posting helpful? Please let me know by posting a comment.
Scott Hanselman hit the nail on the head with his recent post. It gave me quite the laugh, but unfortunately it is so true (just like when I watch Office Space). I was told the other day by a coworker that I over-engineered a solution because I used an HttpHandler instead of just making a page. Anyone with even a small understanding of ASP.NET should realize that using an HttpHandler when no content needs to be displayed is going to be more efficient. This “page” was going to be hit just about every time someone accessed the portal, and from what I have come across, using an HttpHandler will increase performance by 5%-10%. Considering the development time was only increased by the minute or so it took to add the proper section to the web.config, I would have to say that in this scenario the HttpHandler was the best way to go, and that the solution could only be considered “over-engineered” if you don’t understand it. Anyways, I’m beginning to think there is a case of ADD going on in my new environment.
Like I said in an earlier post, I’m currently working on a side project with a couple former coworkers that I hope will become a success. This project has turned my dining room table into an ad hoc office several evenings each week and created the need to set up a development infrastructure to support the project, with the backbone of that being the source control repository. I decided to use Subversion, initially because it was free, but also because it seems to be fairly highly regarded in the community. I’ve used both Visual SourceSafe 6.0 and 2005 in the past, and whereas I’ve never had any direct complaints with VSS, it did always seem to be a bit of a task getting things set up initially when starting a project.
Subversion is working fairly well for us. Because we cannot always get together to work on the project I have http access setup and my partners can very easily sync their source code over the Internet. Subversion integrates nicely with Apache web server to do this. Let me point out though that I have all this running on Windows, not Linux. There are Windows versions of both Subversion and Apache. For the client side of things we are using both TortoiseSVN and AnkhSvn. TortoiseSVN integrates directly into Explorer and is handy for setting up new projects and handling the tasks at the file system level. AnkhSvn is an add-on to Visual Studio that integrates Subversion tasks into the right-click menu of Solution Explorer.
Anyways, my goal in this post is to help anyone who is Googling around trying to set up Subversion get going in the right direction. Ralph Willgos put together a great article on Code Project walking through the steps of getting everything set up, so I will not bother rewriting them. It is a little old, and I skipped the steps for setting up IIS and Visual Studio 2003, but the rest of the steps apply. I will caution you on a couple things. First, make sure you are using the 2.0.x version of Apache, not 2.2.x and not 1.x; from my understanding and experience, Subversion will only work with the 2.0.x version. The other thing I will caution you on is not to use the download links that Ralph provides. They have become out of date as newer versions have been released. I ran into very cryptic error messages when I had different versions of Subversion and TortoiseSVN trying to talk to each other, and this caused several headaches. To get the files, simply go to the links that I have provided and grab the latest versions.
That’s about it. Feel free to drop me a line if you get stuck with Subversion and I will try to help you out. Also, I’d like to read what everyone else has to say about source control in general and what they think about specific products so feel free to post a comment with your experiences. I’m also running a continuous integration server for our project, but I will go into that in a later post.
Well, I’ve definitely been slacking on my blog posts, and I’m sure my frequent visitors have been a little disappointed in me. I have managed to get a new style up for the site and am still trying to get all the content migrated over.
It’s been a chaotic few months for me, as I unfortunately have been forced onto a new career path. Professional Edge closed its doors recently due to what I have been told was a large gap in the sales pipeline. I really enjoyed being a consultant there and was disappointed to leave behind the environment and the individuals I worked with. The sudden change has forced me back into the corporate environment, which, strangely enough, I swore I would never go back to.
Apparently the job market right now is very hot for .NET developers, as I had several offers for development positions. The one thing I did notice from talking with potential employers is that there are a lot of what I would consider bad practices going on, ranging from not using source control, to poor architecture (or none at all), to the production database being the same as the test database.
Anyways, while I feel out the corporate environment in my new job, I have also partnered with two other former Professional Edge employees to work on a project that we feel there is a business need for, so I’ve been putting in some long days lately and probably will be for a while.
On a completely different note, and to keep this blog post useful, I wanted to post links to the three podcasts that I make a point of listening to every week. I guarantee that you would benefit from listening to them as well.
The first one is Hanselminutes by Scott Hanselman. Scott’s podcast is excellent and covers a wide range of technology topics. He normally covers something related to Microsoft development, but sometimes like today, he talks about the latest tech items that are out, such as Microsoft Surface. The best thing about Scott is that when listening to his podcast you feel like you’ve sat down with a friend to discuss technology over a beer. (Scott, I’d love to buy you a beer sometime.)
The second one is .Net Rocks by Richard Campbell and Carl Franklin. Richard and Carl do a great job of pulling in the top .Net people to interview and do a great job exploring two different topics in two shows each week. If you’re not listening to this one then I don’t know how you can call yourself a .Net developer.
The last one is RunAs Radio with Richard Campbell and Greg Hughes. Whereas this podcast doesn’t cover .Net it does cover a lot of technologies, both hardware and software that you’ll be interacting with and integrating with as a developer. Today’s podcast was on the latest version of IIS which every ASP.Net developer will be working with in the future.
Listen to these podcasts! They will keep you informed of what is going on in the .NET community and in technology in general. They add up to about three hours a week, so I normally have plenty of time to listen to them while I am exercising. Just to reiterate: you are a fool if you are not listening to these podcasts. They are a fast and easy way to stay up to date on what is out there and what you should be investigating further. Technology is changing so rapidly that it is impossible to know everything that is out there, let alone learn it all, so listening to these lets me find out what exists and filter out what I want to explore further.
That’s about it for now. I wish all my former coworkers and friends from Professional Edge the best of luck in whatever they end up doing. As for everyone else, drop me a line if you want to contract out some development work to me.
Well, I feel it is important to put together a toolkit of controls to use when developing for a client. One of the controls I have created is the Textbox Filter Extender, which allows you to limit a textbox to numeric, alpha, or alphanumeric characters. I've posted the control for anyone to access and use. Feel free to check it out over in the development tools section. If you have any questions, comments, or suggestions for improvements, please post them to the proper forum section.
Well, I passed the 70-228 exam, which means I am one test away from my MCDBA. The best part is that I am done with the SQL tests, so now I get to focus on what I prefer more, which is development. To study I used the Microsoft Press book MCSA/MCSE/MCDBA Self-Paced Training Kit: Microsoft SQL Server 2000 System Administration. I selected this book simply because it appeared to be the best of a poor crop of books out there. It was a rather boring read, and the hands-on training involved running a bunch of prewritten scripts that were sometimes not explained very well. I passed, though, which is what matters.
What I have decided, though, is that instead of providing me with an evaluation copy of SQL Server 2000 (or, for that matter, the software for whatever test you may be studying for), the publisher should provide the reader with a Virtual PC image of what is required for the test. For example, all of the exercises in this book expected SQL Server 2000 running in a domain with a specific domain/computer name. Also, several exercises required specific domain users to be set up. I would assume that most people do not have a setup like this unless they are studying on company time, and even if I did have a setup like this I wouldn't want to mess it up by adding unnecessary users for the exercises. Now that Virtual PC is free, it would be great for this particular test to have been provided with a virtual image of Windows Server 2003 (feel free to have it expire in 6 months like it would if I downloaded the eval copy) along with the install files for SQL Server 2000. (For this particular test, installing and setup are part of the test, hence not having the software already installed. For the 70-229 test the server could already be installed, as installation is out of scope of that test.) The server could then be preconfigured with the proper naming conventions and users, the setup of which is completely out of the scope of the test. This would allow the reader to focus more time on the topics covered in the exam and not on the environment. I would have to think that something like this would make all of us very happy. Let me know what you think!
Well, I realize that I have fallen extremely far behind and have not finished my tutorial on putting together my DVR setup. Hopefully I will get to it someday, but now that the steps are no longer fresh in my head it may not happen until I need to build a new box. If you do have any questions feel free to post a comment and I’ll help you out the best I can.
What I will post about are the steps I needed to take to update my box for the new DST settings. (I hope the energy savings from the change make up for the pain it has been, and for the old electronic equipment I have whose clocks will now always be off.)
First off check to see what your system is currently set to:
#zdump -v /etc/localtime | grep 2007
Running this command as root will show you the dates your system has set for daylight saving time. If it isn’t set to start in March, then continue.
Go to the home directory for root, use ‘cd $HOME’ if you are still logged in as root and download the patch:
#wget http://debian.oregonstate.edu/debian/pool/main/t/tzdata/tzdata_2007b-1_all.deb
Now you just need to install the patch:
#dpkg --install tzdata_2007b-1_all.deb
Finally, if you run the check again:
#zdump -v /etc/localtime | grep 2007
You should get the following output:
/etc/localtime Sun Mar 11 07:59:59 2007 UTC = Sun Mar 11 01:59:59 2007 CST isdst=0 gmtoff=-21600
/etc/localtime Sun Mar 11 08:00:00 2007 UTC = Sun Mar 11 03:00:00 2007 CDT isdst=1 gmtoff=-18000
/etc/localtime Sun Nov 4 06:59:59 2007 UTC = Sun Nov 4 01:59:59 2007 CDT isdst=1 gmtoff=-18000
/etc/localtime Sun Nov 4 07:00:00 2007 UTC = Sun Nov 4 01:00:00 2007 CST isdst=0 gmtoff=-21600
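Those boundary dates line up with the new US rules the patch encodes: DST now starts on the second Sunday in March and ends on the first Sunday in November. As a sanity check (my own sketch, nothing to do with the tzdata package itself), you can compute those Sundays directly:

```javascript
// Compute the nth Sunday of a given month (0-based month, UTC),
// which is how the post-2007 US DST boundaries are defined.
function nthSunday(year, month, n) {
    var first = new Date(Date.UTC(year, month, 1));
    // Day of month of the first Sunday.
    var firstSunday = 1 + ((7 - first.getUTCDay()) % 7);
    return new Date(Date.UTC(year, month, firstSunday + 7 * (n - 1)));
}

var dstStart2007 = nthSunday(2007, 2, 2);  // second Sunday in March: Mar 11
var dstEnd2007   = nthSunday(2007, 10, 1); // first Sunday in November: Nov 4
```

Both match the zdump output above, so the patch took.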
If only it were this easy with all my other equipment. Special thanks to everyone at the KnoppMyth Forums for posting the information on this.
Well, the guy who has taught me everything I know about .NET, Bigyan, has taken off for a month to go back to Nepal to get married, leaving me with some very big shoes to fill while he is gone. While he is gone I have adopted a new philosophy that I am trying to follow, WWBD, What Would Bigyan Do?
While working on some bug fixes for a web application that we currently have in pilot testing for a client, I found myself working hard to prevent a post back, which I knew was exactly what Bigyan would do to improve the quality of the user experience. I needed to create a custom validator in ASP.NET and could have easily just made it a server side validation, but this would require a post back before telling the user that there was an input error. The way to eliminate this post back is to create a client side validation function using JavaScript.
The first thing that is necessary is to add a CustomValidator control to your page and associate it with a control on the page by setting the ControlToValidate property.
Next, I would typically set the text of the CustomValidator control to an *. This is the text that shows up on the page where the custom validator exists. I normally position the CustomValidator either next to the control I am validating or next to the label for the control I am validating. The * then indicates to the user that there is an error with that field.
After setting the text, I set the ErrorMessage property of the CustomValidator to the error message I want to display. This message will then appear in the validation summary control if you have placed one on the page.
Now it's time to do a little coding. The first thing you will want to do is code up the server side validation for the validator. This part of the validation is important in case the user's browser doesn't support JavaScript or the user attempts to bypass the script on the client side.
In my particular case I was validating a rich textbox control that was required. The control would be blank either if it was actually blank or if it contained "<p> </p>", hence I could not just use the RequiredFieldValidator. To code the server side check you use the ServerValidate event of the CustomValidator control and check the Value property of the args object that is passed in by the system. Perform your logic and if the value passes set args.IsValid = True, otherwise set it to False.
Protected Sub CustomValidator1_ServerValidate(ByVal source As System.Object, ByVal args As System.Web.UI.WebControls.ServerValidateEventArgs) Handles CustomValidator1.ServerValidate
    If String.IsNullOrEmpty(args.Value) Or args.Value = "<p> </p>" Then
        args.IsValid = False
    Else
        args.IsValid = True
    End If
End Sub
This function will then execute when a button that has its CausesValidation property set to true is clicked. Then, at any point if you want to check whether all the validators on a page are valid, you check if Page.IsValid is equal to true.
That’s it for handling the server side validation, but the next step is to handle the client side validation so that the client is not forced to look at a post back before knowing that they have an invalid input field.
To handle the client side validation you just need to write a simple JavaScript function that takes two parameters, sender and args. You can then evaluate args.Value against your logic, and set args.IsValid equal to either true or false.
function TxtQuestionValidate(sender, args)
{
    if (args.Value == '' || args.Value == '<p> </p>')
    {
        args.IsValid = false;
    }
    else
    {
        args.IsValid = true;
    }
}
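One caveat with the function above: it compares against a single literal string, so any variation the rich textbox emits (extra whitespace, a non-breaking space) would slip past it. A more forgiving check, which is my own sketch and not part of the original validator, strips the markup first and then tests whether any real text remains:

```javascript
// Sketch of a more robust "effectively empty" test for rich-text
// input: drop tags, normalize non-breaking spaces, then see if any
// visible text is left.
function isEffectivelyEmpty(html) {
    var text = html.replace(/<[^>]*>/g, '')   // remove markup like <p>...</p>
                   .replace(/&nbsp;/g, ' ');  // treat &nbsp; as a plain space
    return text.trim() === '';
}
```

The same stripped-and-trimmed check could be mirrored in the server side ServerValidate handler so both sides agree on what "blank" means.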
You have a couple of options for placing the JavaScript function. The simple way would be to put it right into the .ascx page. While this is easy, I try to steer away from it, as you would then need to place the function in every page where you need it. I prefer to write a service layer function that returns it to me as a string and then insert it into the page in the page load event as follows:
Page.ClientScript.RegisterClientScriptBlock(Me.GetType(), "TxtQuestionValidate", Services.GeneralService.GetTxtBoxValidateJavascript(), True)
Finally, you need to tell Studio to use this function as the client side script, so on your custom validator control simply set the ClientValidationFunction property to the name of your JavaScript function. You now have a custom validator with a client side script that prevents a post back, which I am pretty sure is what Bigyan would do. I hope he’s having a good time in Nepal, but I’ll be happy to have him back in 25 days. It’s boring going to Starbucks by myself!
Well, I've been slacking on getting the next blog post out on the steps I used to build my DVR. I will hopefully get time to post the next steps soon, but for now I just wanted to post an update on my hardware setup. Like I had posted earlier, I had been wanting to increase the memory in the computer. It started out with 512MB, which was working fine, but I am sure the system would suffer if I were recording two shows and watching a recording. I went ahead and added 1GB more of memory, giving me a total of 1.5GB. I didn't do any specific benchmark testing, but the picture did seem to look a little better. I really wanted this memory boost for when I add an HD tuner card to the system, which I hope to do in the next couple months.
The other thing I added was another fan to the system. I have been experiencing system lockups about every 2 to 3 weeks and I think it is because the system is overheating. The box lives on the bottom shelf of a wood entertainment stand, so I have a feeling the heat is building up in the enclosed space. I hope this extra fan will resolve the issue.
That's about it for now. For those of you keeping track of the total investment so far that I have put into the system, the fan was $2.99 and the memory was $89.99. I got both from Newegg.com and including shipping they came to $99.27. Hopefully I'll have time to get the next few DVR steps up soon.
Well, for sometime now I have been working on building a DVR from a computer. After several months I am finally at a point where I consider the project done, and I have decided to post a how-to so that others can build one of their own and hopefully bypass a few of the pitfalls that I hit and benefit from my time Googling.
I chose to build my DVR using MythTV, a Linux DVR platform, and specifically I used the KnoppMyth version. Why didn't I use Windows Media Center? Well, I'm a big fan of the open source movement, and KnoppMyth offers a wide variety of features for free. Several co-workers have pointed out that I could have saved a lot of time and money by either buying a TiVo for about $100 at Best Buy, or paying an extra few bucks a month to get a DVR through my cable company. I chose to build my own because it has a lot more functionality than your basic DVR. I also have the ability to rip my music collection to it as well as my DVD collection. It has a screen that shows me the weather for my area, as well as a feed reader. Since it's open source there are add-ins for everything you could desire. I also have the ability to link several of them together (which I plan to do) so that I can stream shows to different rooms in the house. Not to mention the fact that since it is a standard computer I can continue to upgrade it as much as I want. Also, I have no monthly fees, so I could rationalize a little more up front cost.
This is the first of a several-part post walking through the setup and configuration settings I used for my KnoppMyth system. Hopefully it will help others through the problems I encountered along the way.
In this first post I would like to list the specs for my setup and a few of the requirements that you should take care of before getting started.
The computer I am using is a Compaq Presario SR1910NX I picked up at CompUSA for $200. The general specs are as follows:
- Motherboard: Asus A8N-LA
- Processor: Sempron (P) 3200+ 1.8 GHz
- Chipset: GeForce 6150 LE
- Memory: 512 MB
- Storage: 120 GB SATA 3G
- Media: CD Writer DVD Combo
- Sound: Integrated
- Network: Integrated 10/100 NIC
There is no particular reason I picked this computer. I actually bought it to use for a web server, but reallocated it to this project instead. The one thing you should steer clear of is a VIA chipset. Most of them just don’t seem to work with the software.
The rest of the default specs are fairly standard, but feel free to view the link above to get all of the other specs. To turn the computer into a DVR I added several other hardware pieces. The first necessary component is a TV tuner card. I wanted to be able to record two shows at once, or watch one show and record another, so I got the PVR 500 by Hauppauge. This is a dual analog tuner built onto one PCI card, which I purchased from NewEgg for $144.98 including shipping. I chose it for the fact that it would only use one expansion slot in my computer. One problem I ran into with this card is that Hauppauge had recently switched it to use a Samsung chip, which apparently hurt the video quality compared to previous versions. Thankfully I found a patch that corrected the video issue and I am happy with the quality now. From my research the single tuner version, the PVR 150, does not have the same video issues as the PVR 500, but I have not used it.
The next necessary addition was a video card. The computer did have integrated video, but with only VGA out. I added a card with both S-Video out and DVI out. I currently have a 42 inch widescreen HD TV that the DVR is hooked up to. I am only connected to it by S-Video at the moment, but I would eventually like to switch it over to the DVI. The card I got is an NVIDIA GeForce 6200LE 256MB (128MB on board) made by BioStar with a PCI Express interface. I also got this from NewEgg for $41.98 including shipping. I was very happy with the card, as it worked right out of the box without much of any configuration. One thing I learned purchasing this product is to look at how many MBs are on board. As you can see, this card says it has 256MB. If you look closer you will find that it only has 128MB on board and it uses system memory for the rest.
The last piece of hardware I added was a remote. The remote I bought was the Streamzap PC remote from Streamzap Inc. This remote connects to the computer via USB and worked out of the box. I bought this remote from Provantage for $29.58 including shipping.
This is currently my entire hardware setup. There are two spots where I would have made modifications and plan to in the future. The first is more memory. In my opinion it would be best to have at least 1GB, and I plan to go ahead and just add another 1GB once I find a good deal on it. This will give me a total of 1.5GB of memory. When I am recording I can see that memory usage is maxed, so I think the system would benefit from more memory. The second thing I would modify is the storage space. I currently have 120GB. This is fine, but I have to make a point of getting shows deleted promptly. I think it would be best to have 250GB. With this amount I don't think anyone would have very many problems unless they are ripping a lot of music and movies to their system. I plan to add more storage space sometime, but it isn't a major priority.
The last thing I would like to eventually add is an HD over-the-air tuner so that I can record my local channels in HD. These run about $100 and I will post more specifics once I pick one up and get it installed.
The last thing I want to go over are a few pre-installation items that need to be done. The first is to download and burn the ISO for the KnoppMyth install. The current version, which I am using, is R5D1. Download it and burn a bootable installation CD.
The second thing you will want to do is sign up for an account on the forums for KnoppMyth. I received a lot of help from the users there that was very valuable. The other place with very helpful information and how-tos is the Wiki. These two places provide extensive information for setting up a KnoppMyth system.
The last thing you will need to do is sign up and configure your account on Zap2it Labs. This is the account your system will use to get its TV listings. I don't specifically remember what the sign up process is, as it has been a long time since I filled it out, but I don't recall any issues. There is a how-to on the Wiki about installing that references using the code 'TGYM-ZKOC-BUTV', so you may need it, but I don't recall.
That's about it for now. Next time I'll walk through the basics of handling the install and point out the areas where I ran into some troubles.
Well, a couple of days ago I passed the Microsoft 70-229 exam, Designing and Implementing Databases with Microsoft SQL Server 2000 Enterprise Edition. The book I used to study for this exam was by Microsoft Press, Microsoft SQL Server 2000 Database Design and Implementation. I wasn't very impressed with this book as it left a bit to be desired. It seemed to only make reference to some important items on the test, and it would introduce new items in script examples with no explanation. Plus, I found the sample questions in the book not very reflective of the style of questions on the test. I realize that it is difficult to cover the entire topic of SQL databases in one book, but I feel this book could have done a much better job. Steer away from it if you can.
If you do anything with computers then you probably get this all the time. It seems like once people get a computer, even though they come to depend on it, they exert very little effort to maintain it, leaving it open to viruses and spyware. I always get asked to take a look at these computers and a significant part of the time this requires the use of the original Windows CD which the owner never seems to be able to find. After being stuck at a computer this past Monday with no way to get it booted and a corrupt dll that needed to be replaced, I decided that I should start putting together a "tool kit" that I could have ready to be more prepared for these free service calls.
The first item I have added to my toolkit is BartPE. Run BartPE, point it at a set of Windows install files, and it creates and burns an ISO to CD or DVD. This disc is then a bootable version of Windows that can be placed into any computer and booted. You then have access to a functional OS with Explorer access to the problem computer's original Windows partition. BartPE allows you to include many other programs on your bootable OS, such as a virus scanner and Ad-Aware, which can come in handy when someone hasn't been taking care of their computer. I checked several reviews of BartPE and most people gave it five out of five stars. I think BartPE is one of the best things I've come across lately and I plan to make it the first and permanent addition to my "toolbox". Feel free to post a comment with anything else you think would make a good addition.
I've begun to notice that several people are unaware of a very important feature of terminal services that can come in handy. When a server is set up for remote administration mode, Microsoft allows two remote administration sessions. There are at least two issues with this. The first is that there are only two, so if three different administrators try to remotely log into the same server, the third one is going to receive a message informing them that they cannot log in. The other issue is that some applications output their messages to the console, so if you are logged in remotely you may not be able to view a critical message.
So what's the solution to this? Well what few people seem to be aware of is that you can remotely connect to the console session. This allows you to view any error messages that may get outputted directly to the console, and this also gives you one more remote session to utilize.
In order to access the console session by way of terminal services you'll need to invoke mstsc from the run prompt using the /console parameter, enter in your address as usual, then log into the remote system.
It is extremely simple, yet can come in very handy, especially when a few administrators need access to the same server. This trick could save you a long walk to the server room. Just remember, if you're logged into the console session remotely, that means no one can log in directly at the keyboard. If someone has the console locked when you try to log in remotely, you will be asked if you want to continue and end their session. This will work on servers in application mode, but you must be an administrator on the server to log into the console session.
Well, I recently passed the Microsoft 70-290 exam, Managing and Maintaining a Windows Server 2003 Environment, and the primary book I used for studying was one published by Que titled MCSA/MCSE Training Guide: Managing and Maintaining a Windows Server 2003 Environment. The book was an easy read, but I don't feel it alone was enough to pass the test, even though I did manage to do so on the first try. I do consider the book a good introduction for someone who is not very familiar with Windows server environments. It goes over all the basics of the features of 2003, but I don't feel the details were covered enough to allow someone to successfully pass the test. My suggestion would be to go over plenty of practice tests and get hands-on experience with Server 2003. You can download a free six-month evaluation version, so you should have sufficient time to go through all the examples in the book and try setting up your own functioning server environment. The test has a handful of simulation questions, so someone with hands-on experience should feel more comfortable on those. The book assumes previous knowledge of Active Directory and domains, so it would be a good idea to pick up a book on those topics before beginning this one.
Originally the mail server for ctrlalt313373.com was running on a FreeBSD box using the qmailrocks setup. IMHO, this is the best mail setup you can have, and it’s all free except for your hardware. This setup has built in spam and virus filters, webmail access, web administration features, and an excellent mailing list setup. So what's the only drawback? Well, you have to set it up and administer it yourself. Not a problem for day to day activities, but when it comes to upgrades and troubleshooting errors it can get a little hairy. The support mailing list is full of very helpful people, but trying to get things fixed when you really need them fixed fast can be complicated unless you know all the ins and outs of the multiple software packages involved in the install.
As a result ctrlalt313373.com has been switched to Gmail For Your Domain Beta. Granted, it doesn't have nearly the number of features that the qmailrocks setup has, but it does have the power of Gmail, and best of all it's all running on the very reliable servers at Google. There are a few significant drawbacks that will hopefully be resolved soon. The first is that total storage space is limited to 2GB. This is extremely low for most domains. Also, the number of email accounts for the domain is limited to 25. This as well is fairly limiting for most domains. One final thing that needs to be added is secure IMAP access to the servers, as for now there is only the Gmail web interface and POP access. (Okay, one more. Multiple domain aliases would be nice as well.)
So why do I think this is so great? Well, because of the reliability of the servers and the ability to use the features of Gmail. Also, the folks at Google are always working hard to add new features, so you know something excellent is on the way….