Wednesday, December 14, 2011

Continuing with WattDepot: hale-aloha-cli-tiger

As part of our continuing development with WattDepot, my group and I were given the task of adding some new functionality to another project called hale-aloha-cli-tiger. We added three new commands to their CLI system: set-baseline, monitor-power, and monitor-goal. Here are the commands and a short description of each:

set-baseline [tower | lounge] [date]
This command defines [date] as the "baseline" day for [tower | lounge]. [date] is an optional argument in YYYY-MM-DD format and defaults to yesterday. When this command is executed, the system obtains and saves the amount of energy used during each of the 24 hours of that day for the given tower or lounge. These 24 values define the baseline power for that tower or lounge, one for each one-hour interval. For example, if lounge Ilima-A used 100 kWh of energy during the hour 6am-7am, then the baseline power for Ilima-A during the interval 6am-7am is 100 kW.

monitor-power [tower | lounge] [interval]
This command prints out a timestamp and the current power for [tower | lounge] every [interval] seconds. [interval] is an optional integer greater than 0 and defaults to 10 seconds. Entering any character (such as a carriage return) stops this monitoring process and returns the user to the command loop.

monitor-goal [tower | lounge] [goal] [interval]
This command prints out a timestamp, the current power being consumed by the [tower | lounge], and whether or not the lounge is meeting its power conservation goal. [goal] is an integer between 1 and 99. It defines the percentage reduction from the baseline for this [tower | lounge] at this point in time. [interval] is an integer greater than 0.
For example, assume the user has previously defined the baseline power for Ilima-A as 100 kW for the time interval between 6am and 7am, and the current time is 6:30am. If the goal is set as 5, then Ilima-A's current power must be 5% less than its baseline in order to make the goal. At the current time, that means that Ilima-A should be using less than 95 kW of power in order to make its goal.
It is an error if the monitor-goal command is invoked without a prior set-baseline command for that [tower | lounge]. Entering any character (such as a carriage return) stops this monitoring process and returns the user to the command loop.
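The goal check described above boils down to a simple percentage computation. Here is a minimal sketch of that logic; the class and method names are mine for illustration, not the actual hale-aloha-cli-tiger code:

```java
// Hypothetical sketch of the monitor-goal check. Names and structure are
// illustrative; the real implementation also reads the saved baseline.
public class GoalCheck {

    /**
     * Returns the maximum power (kW) allowed at the current time in order
     * to meet a [goal] percent reduction from the baseline power.
     */
    public static double goalThreshold(double baselineKw, int goalPercent) {
        if (goalPercent < 1 || goalPercent > 99) {
            throw new IllegalArgumentException("goal must be between 1 and 99");
        }
        return baselineKw * (100 - goalPercent) / 100.0;
    }

    /** True if the current power meets the conservation goal. */
    public static boolean meetsGoal(double currentKw, double baselineKw, int goalPercent) {
        return currentKw < goalThreshold(baselineKw, goalPercent);
    }
}
```

For the Ilima-A example, `goalThreshold(100, 5)` is 95.0 kW, so a current reading of 94 kW meets the goal while 95 kW does not.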

Our group implemented the commands according to the specifications, and they all work as expected. There were some problems due to the previous state of the hale-aloha-cli-tiger project, such as low test coverage and poor error handling, but our group was able to work through them. Some of these problems persist, but we were not expected to fix them since they were inherited from the previous project.

I would say this system satisfies the three prime directives, that is, the system successfully accomplishes a useful task, an external user can successfully install and use the system, and an external developer can successfully understand and enhance the system. Despite some of the errors, the newly implemented commands are useful for retrieving energy data from the Hale Aloha Towers, and we were obviously able to enhance the system as external developers. There are also several guides on the project site that should help any external user successfully install and use the system. Based on that alone, the project satisfies the three prime directives.

Monday, December 5, 2011

Watt Depot Technical Review

As you may recall from my previous blog posts, I've been working on a command-line interface program which uses the WattDepot library to collect information about energy consumption for several residence halls at the University of Hawaii. I've learned a lot about software development throughout this project, and now I've been given the opportunity to review a similar system developed by another group which accomplishes the same task, to determine whether the system satisfies the Three Prime Directives. Here is a link to the project site I am reviewing.

Review question 1: Does the system accomplish a useful task?

Here are the available commands supported by the system:

> help
Here are the available commands for this system.
current-power [tower | lounge]
Returns the current power in kW for the associated tower or lounge.
daily-energy [tower | lounge] [date]
Returns the energy in kWh used by the tower or lounge for the specified date (yyyy-mm-dd).
energy-since [tower | lounge] [date]
Returns the energy used since the date (yyyy-mm-dd) to now.
rank-towers [start] [end]
Returns a list in sorted order from least to most energy consumed between the [start] and [end] date (yyyy-mm-dd)
quit
Terminates execution
Note: towers are: Mokihana, Ilima, Lehua, Lokelani. Lounges are the tower names followed by a "-" followed by one of A, B, C, D, E. For example, Mokihana-A.

There are a few problems with the system in question. Some major functionality, including the "daily-energy" and "rank-towers" commands, is missing or broken, when it should be present and functional according to the system requirements. For instance, there is a bug in the "daily-energy" command that produces the error message "Date must be before today" even when the given date argument is valid; in short, I could not get the "daily-energy" command to execute properly no matter what input I provided. Additionally, the system does not handle some exceptions when dealing with user input and instead may crash if the user provides "bad" input.


The other two required commands, "current-power" and "energy-since", are implemented and function properly, except, as previously mentioned, the system may crash if the user provides bad input or incorrect arguments. Because some major functionality is broken or missing, and because the system does not handle bad user input in a graceful manner, I would conclude that the system is somewhat useful, but could definitely be improved.


Review question 2: Can an external user successfully install and use the system?

The project site home page provides a short, clear description of the system and its purpose. Although there is no sample input and output provided on the page, it is still pretty clear what the system does based on their description. There is also a User Guide wiki page that provides details on how to download, install and execute the system. However, I found that the instructions were not completely correct, as it said the .jar file used to run the system would be in the /build/jar directory, when it was actually in the top level directory (there was no /build directory in the distribution I downloaded). There was no compilation necessary since they provided an executable .jar file, so it was pretty easy to run the system as a user. Furthermore, the download link was labeled with the version number, so users can easily keep track of what system they are using.

Review question 3: Can an external developer successfully understand and enhance the system?

The Developer's Guide wiki page provided clear instructions on how to build the system from sources, as well as the project guidelines for development, including the implementation of JUnit tests for all new features. It also provides guidelines for formatting code (including a downloadable file for auto-formatting within Eclipse), adding new issues, and verifying the build before committing changes with "ant -f verify.build.xml". I was able to determine that the developers use automation tools such as Checkstyle, Findbugs, PMD, and JUnit.

Lastly, there is a link to the Continuous Integration project build on Jenkins. From the wiki page I was able to conclude that the developers use Issue Driven Project Management as their development process, although this is not explained in detail on the page itself. There were no specific instructions on how to generate JavaDoc documentation, but I was able to accomplish this without any problems and I feel most Java developers would be able to as well. The documentation was well-written, and I felt the developers used descriptive names throughout their system, such that the purpose of a method, class or variable could be determined implicitly from its name in most cases.

I was able to generate coverage information about the system using JaCoCo, and found that their tests only covered about 21% of the code. I saw that the command manager class, as well as each respective command, had its own JUnit test class associated with it, but some of the tests were not named correctly and thus may not have been counted in the coverage. This likely contributed to the low coverage percentage.

I found that the source code followed coding standards and made effective use of comments for the most part. The code was easy to understand and I feel like an external developer could understand the system fairly easily by looking at the code and documentation.

I read the "Issues" page on the project hosting site to determine which parts of the project were worked on by each developer. Judging by the number of issues attributed to each member, all three developers contributed to the project equally. Furthermore, I believe it would be fairly easy for an external developer to determine who to ask if they had a question regarding a particular part of the system.

I had a look at the Continuous Integration server associated with the project and saw that whenever there was a build failure, it was corrected promptly, or at the very least someone began working on a fix as soon as possible. It was clear that the system was worked on in a consistent fashion, and every commit was associated with an appropriate issue, which means the developers did a good job of breaking the system up into issues of manageable size.

From these results I would conclude that a new external developer could successfully understand the current system and enhance it with relative ease. I think the documentation, comments, and the readability of the source code would be pretty helpful to a new developer, and they would be able to successfully contribute with the current team to improve the system.

Tuesday, November 29, 2011

Watt Depot Group Project

For the past couple of weeks I've been working on a group project expanding on the WattDepot service that, if you recall from my previous blog posts, collects electricity data from meters and stores it in a database. This allows other tools to retrieve the data for visualization and analysis.

Our group created a command line interface program to help understand various aspects of energy use in the Hale Aloha residence halls at the University of Hawaii at Manoa. The user can call different commands and retrieve energy data with output that is easy to read and nicely formatted.


Furthermore, our program is modular and uses a reflection-based command manager that allows new commands to be added easily to the system. Because the command manager scans for commands at startup rather than holding a static list of them, adding a new command requires no code changes beyond the command implementation itself.
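The idea can be sketched in miniature. This is not the actual hale-aloha-cli-tnt code: the real manager discovers classes by scanning its commands package, which is elided here, and all names below (Command, register, CurrentPowerCommand) are illustrative:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a reflection-based command manager. Classes are loaded
// by name and registered only if they implement the Command interface, so
// adding a command never requires editing the manager itself.
public class CommandManager {

    public interface Command {
        String getName();
        void execute(String[] args);
    }

    // Example command; a real one would query the WattDepot server.
    public static class CurrentPowerCommand implements Command {
        public String getName() { return "current-power"; }
        public void execute(String[] args) { /* fetch and print power */ }
    }

    private final Map<String, Command> commands = new HashMap<String, Command>();

    /** Loads a class by name and registers it if it is a Command. */
    public boolean register(String className) {
        try {
            Class<?> clazz = Class.forName(className);
            if (Command.class.isAssignableFrom(clazz)) {
                Command cmd = (Command) clazz.newInstance();
                commands.put(cmd.getName(), cmd);
                return true;
            }
        } catch (Exception e) {
            // Ignore classes that cannot be loaded or instantiated.
        }
        return false;
    }

    public Command get(String name) {
        return commands.get(name);
    }
}
```

The command loop can then dispatch user input to `get(name).execute(args)` without ever naming a concrete command class.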

This was my first experience with issue-based project management and working with a group using tools like subversion and google project hosting. Overall it was a really positive experience, and I know I will use the skills I learned in the future when building software in a group environment. My group members communicated effectively and I believe the quality of our software is pretty good. We implemented all of the required functionality as well as the reflection-based command manager which was not required, but offered for extra credit (Thanks Toy!).

Overall this project was a positive experience and one of the better group projects I've been a part of, simply because communication was not a problem. I've had my fair share of bad group projects where communication breaks down, and they can be pretty painful experiences. Thankfully we were able to work together effectively and communicated with each other every step of the way.

Project site: hale-aloha-cli-tnt

You can download a release of the program from our hosting site here.

Tuesday, November 8, 2011

Hawaii's Energy Future

Hawaii is a unique place unlike anywhere else in the world. But the challenges it faces and will continue to face in the near future regarding energy needs are just as unique, and that can be viewed as a blessing as well as a curse. On one hand, there are so many opportunities for people from a wide range of professions to become involved in improving and innovating Hawaii's energy future. Yet, on the other hand, there is a sense of urgency regarding energy in Hawaii because of our dependence on imported oil for our energy needs.

This means that if we fail to develop sustainable energy in a meaningful way, and before it's too late, the consequences for Hawaii will be far more severe than they would be in other places, like the mainland U.S. Hawaii already has some of the highest gas prices in the nation, and we see those costs not only at the pump, but at the grocery store and just about everywhere else as well.

One thing I've been exposed to regarding Hawaii's energy future, and that I think is really exciting, is the fact that Hawaii can potentially develop just about every renewable energy source there is, from solar and wind, to geothermal and more. This means that we can become experts on all of these if we devote the necessary resources and effort they require. Furthermore, we will be able to export that expertise abroad as the need for renewable energy continues to grow elsewhere.

What are some of the problems facing renewable energy in Hawaii? So far it seems there are definitely some environmental and cultural issues surrounding the topic. Some people are strongly against building wind turbines on certain islands or in culturally significant locations. Some don't think it's fair for one island to host wind turbines in order to power other islands. I think these sorts of issues will begin to take a back seat once that "sense of urgency" I previously mentioned begins sinking in (when gas prices hit $10 a gallon, etc.). When that happens, people will check their priorities and begin to see that renewable energy is of paramount importance.

I think it will take a lot of cooperation from everyone if we are to solve the energy problem in Hawaii. Furthermore, I think that educating people about energy, and whether or not we can do that effectively, will prove to be one of the most important factors in whether we improve Hawaii's energy future. We can all make a difference just by conserving energy ourselves and being more aware.

WattDepot Katas

To begin our exploration of energy-related programming, my fellow ICS classmates and I have been tasked with hacking out 6 programming exercises, or katas, that deal with WattDepot.

What is WattDepot you ask? Well, here is part of the description from the host site:

"WattDepot is an open source, RESTful web service that collects electricity data (such as current power utilization or cumulative power utilization) from meters and stores it in a database. The data can then be retrieved by other tools for visualization and analysis. It is designed to provide infrastructure for experimentation and development of "Smart Grid" applications."

So as you can see, WattDepot is a useful tool for viewing energy consumption data. Here are the 6 katas we were required to do in order to become more familiar with the WattDepot library:

Kata 1: SourceListing

Implement a class called SourceListing, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and their descriptions, sorted in alphabetical order by source name.

I finished this kata during class. It took me about twenty minutes and was pretty straightforward.

Kata 2: SourceLatency

Implement a class called SourceLatency, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and the number of seconds since data was received for that source, sorted in ascending order by this latency value. If no data has ever been received for that source, indicate that.

This kata was also straightforward. It took me a half an hour or so.

Kata 3: SourceHierarchy

Implement a class called SourceHierarchy, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a hierarchical list of all sources defined on that server. The hierarchy represents the source and subsource relationship between sources.

This was the only kata I didn't complete. I couldn't figure out how to get the subsource data that I needed in order to print the hierarchy the way I wanted.
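In hindsight, the printing itself is just a recursive walk; the part I couldn't crack was pulling the subsource data out of WattDepot. Here is a sketch of the traversal I was aiming for, with a plain map of source name to subsource names standing in for the WattDepot API calls:

```java
import java.util.List;
import java.util.Map;

// Sketch of hierarchical source printing. The subsource relationships are
// modeled as a Map here; the real kata would build this from the server.
public class SourceHierarchy {

    /** Appends source and its subsources to out, indented two spaces per level. */
    public static void printTree(String source, Map<String, List<String>> subs,
                                 int depth, StringBuilder out) {
        for (int i = 0; i < depth; i++) {
            out.append("  ");
        }
        out.append(source).append('\n');
        List<String> children = subs.get(source);
        if (children != null) {
            for (String child : children) {
                printTree(child, subs, depth + 1, out);
            }
        }
    }
}
```

Given a map where "Ilima" has subsources "Ilima-A" and "Ilima-B", this prints the tower followed by its lounges, each indented one level.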

Kata 4: EnergyYesterday

Implement a class called EnergyYesterday, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and the amount of energy in watt-hours consumed by that source during the previous day, sorted in ascending order by watt-hours of consumption. If no energy has ever been consumed by that source, indicate zero.

Katas 4, 5 and 6 took me about three hours altogether. They were pretty challenging because this was the first time I dealt with timestamps or dates in Java, and I found them to be very non-trivial.

Kata 5: HighestRecordedPowerYesterday

Implement a class called HighestRecordedPowerYesterday, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and the highest recorded power associated with that source during the previous day, sorted in ascending order by watts. Also indicate the time when that power value was observed. If no power data is associated with that source, indicate that.

Kata 6: MondayAverageEnergy

Implement a class called MondayAverageEnergy, whose main() method accepts one argument (the URL of the WattDepot server) and which writes out the URL argument and a list of all sources defined on that server and the average energy consumed by that source during the previous two Mondays, sorted in ascending order by watt-hours.

This kata was hard because you had to figure out how to determine the day of the week in order to find the average energy for a Monday--another non-trivial task. This kata took me about an hour.
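The day-of-week arithmetic that made this kata hard can be handled with java.util.Calendar: walk backwards one day at a time until the calendar lands on a Monday. This is a sketch of the approach, not my exact kata code:

```java
import java.util.Calendar;

// Sketch of the "find the previous Monday" step from Kata 6. Calling this
// twice (feeding the result back in) yields the previous two Mondays.
public class MondayFinder {

    /** Returns a copy of 'from' rolled back to the most recent Monday before it. */
    public static Calendar previousMonday(Calendar from) {
        Calendar day = (Calendar) from.clone();
        do {
            day.add(Calendar.DAY_OF_MONTH, -1);
        } while (day.get(Calendar.DAY_OF_WEEK) != Calendar.MONDAY);
        return day;
    }
}
```

Starting from Tuesday, November 8, 2011, for example, this lands on Monday, November 7.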

From these katas I learned that energy data manipulation can be really useful. I think it's important that the software analyze the data and create output that is easy to read and understand. That way it can be fully utilized by other professionals and not just programmers.

Monday, October 24, 2011

5 Things ICS Students Should Know in Order to Pass the Midterm and Maybe Graduate One Day

Midterms are upon us, which means it's that time of the semester again. Sleep-loss is at a peak and attention spans are running dangerously low around campus. In order to protect ourselves against impending doom, my fellow ICS students and I have been working in collaboration to produce a study guide, to which I am contributing the following five questions.

1. When starting Ant, you can select which target(s) you want to have executed. What happens if no target is given?

A: When no target is given, the project's default is used.

2. What is one example of manual quality assurance techniques?

A: Conducting code reviews. (Writing JUnit tests, by contrast, is an automated technique, since the tests can be re-run mechanically.)

3. Why is manual quality assurance bad for finding low level code defects?

A: Manual techniques are difficult and expensive to apply to low-level code defects, and the effort must be redone for every project.

4. In Java, when should you use Enumerated types (rather than, say, an ArrayList)?

A: Whenever you have a well-defined, fixed set of values which are known at compile-time.

5. What is the convention for naming packages?

A: Use reversed internet domain names as package prefix, ex. "edu.hawaii." Use a single lowercased word as the root name for each package.
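To illustrate the enum answer from question 4, here is a small example using the Hale Aloha tower names from the group project, which are exactly the kind of fixed, compile-time value set enums are meant for (the class and method names are mine):

```java
// Example of a well-defined, fixed set of values known at compile time.
public class EnumExample {

    public enum Tower { MOKIHANA, ILIMA, LEHUA, LOKELANI }

    /** Enums come with free utilities like values() and valueOf(). */
    public static int towerCount() {
        return Tower.values().length;
    }
}
```

Unlike an ArrayList of strings, an enum gives compile-time checking: a typo like `Tower.LIMA` is a compile error rather than a runtime surprise.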

Wednesday, October 19, 2011

Host with the Most

My beloved robot, BattleBot, who has been the topic of most of my recent blogging, has a new home on the interwebz. That's because I've recently created a Google Project Hosting site in his honor so that other, inferior robots may look upon him and become inspired by his awesomeness...

In all seriousness, the reason I did this was to gain experience with configuration management systems and subversion, neither of which I had ever used prior to this. After downloading the SmartSVN client and interacting with my subversion repository, I quickly realized how much of a "game changer" configuration management really is. Having worked on a few projects with small amounts of code in a group before, I know how hard it is to manage the sharing of files and how difficult it can be to track changes and stay in sync. Thankfully, subversion makes collaboration much easier. It was simple to set up and get running, and although there are some practices that I will need to get used to (running 'verify' before every commit, etc.), overall it really does make life easier.

One of the things that I really liked about the Google Project Hosting is the ability to track changes to the source code and even view the changes in the source code directly in a side-by-side view. The idea that if my partner makes a change, I can view that change without opening Eclipse or downloading any files is really nice. However, I was a little surprised that it also showed every change I made to the wiki pages as a revision as well.


Overall, I think configuration management and subversion are really going to be helpful once I start collaborating more with other students and working on group assignments. I think it will allow me to focus more on writing code instead of spending too much time emailing files and trying to stay in sync. It definitely relieves some of the headaches that can occur when collaborating.

BattleBot Google Project Hosting Site: http://code.google.com/p/robocode-ajo-battlebot/


Tuesday, October 11, 2011

Robocode Competitive Robot

Over the past week or so, I've been creating my own competitive Robocode robot for an upcoming competition against other software development students. My previous blog posts have touched upon the basics of Robocode and build systems using Apache Ant, and the competitive robot is supposed to be a culmination of everything I've been learning about so far.

The bad news? I'm competing against grad students (shudders) as well as other undergrads, so my poor robot stands little chance of coming out of the competition in one piece. Especially if anyone adopts a strategy similar to the sample robot simply known as "Walls" (I never knew I could dislike a virtual robot as much as this one), which I have yet to beat with the robot I've named "BattleBot."


With my robot, the first thing I tried to do was think about movement. I figured that sideways movement, or strafing, would be best for dodging enemy bullets. Thus, I programmed my robot to square off perpendicular to the other robot the entire time. Rather than moving each turn on its own, my robot only moves when it detects an "energy drop" (the enemy's energy level decreased from the previous turn), which allows it to dodge enemy bullets. The idea is that firing a bullet depletes a robot's energy, so we can use that information to guess that the enemy has fired a bullet.
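The energy-drop heuristic can be sketched outside the Robocode API as a plain class that compares the enemy's energy from turn to turn (the class and method names here are illustrative, not BattleBot's actual code). In Robocode, firing a bullet costs the shooter between 0.1 and 3.0 energy, which is what makes the drop detectable:

```java
// Sketch of the energy-drop detection used to time dodges: a drop in the
// enemy's energy within the legal bullet-power range suggests it just fired.
public class EnergyDropDetector {

    private double lastEnergy = -1; // -1 means no reading yet

    /** Feed in the enemy's energy each turn; true means it probably fired. */
    public boolean enemyProbablyFired(double enemyEnergy) {
        boolean fired = false;
        if (lastEnergy >= 0) {
            double drop = lastEnergy - enemyEnergy;
            fired = drop >= 0.1 && drop <= 3.0;
        }
        lastEnergy = enemyEnergy;
        return fired;
    }
}
```

Inside a real robot, the per-turn energy reading would come from the scan event, and a `true` result would trigger the sideways dodge.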

In order to target the enemy robot, I simply built upon the movement strategy which I had already implemented. Because my robot is always squared off sideways to the enemy, I just turn the gun 90 degrees at the start, so that it's pointed at the enemy, and then lock the gun so it turns with the robot as it adjusts its position. That way the gun is always pointing at the enemy. This strategy works well against robots that don't use a lot of random movements. I tried to improve my strategy by using linear targeting to shoot moving targets, but my implementation was not accurate enough so I ultimately did not use the code.

Because my robot's gun is always pointed at the enemy, I decided to fire every turn if possible. My robot scales bullet power inversely with its distance from the enemy in order to conserve energy (the farther the enemy, the less chance my robot has of hitting it). Also, my robot does not fire if its energy is below a certain level so that it doesn't run out of energy.
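The distance-scaled power rule with a low-energy cutoff can be sketched as a pure function. The 400 scaling constant and the 10-energy floor are illustrative choices, not BattleBot's exact numbers; Robocode itself clamps bullet power to the range 0.1 to 3.0:

```java
// Sketch of distance-scaled bullet power: strong shots up close, weak shots
// at range, and no shot at all when energy runs low.
public class FirePower {

    public static double bulletPower(double distance, double myEnergy) {
        if (myEnergy < 10.0) {
            return 0.0; // hold fire to avoid draining out
        }
        double power = 400.0 / distance; // closer enemy, stronger shot
        return Math.max(0.1, Math.min(3.0, power)); // Robocode's legal range
    }
}
```

A robot would call something like `fire(bulletPower(enemyDistance, getEnergy()))` each turn, skipping the shot when the result is zero.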

Of the sample robots included with Robocode, my robot can reliably beat all of them except for Walls. Against Crazy and SpinBot, my robot wins about eighty percent of the time (which I consider to be pretty reliable), and it always beats RamFire, Fire, Corners, Tracker and SittingDuck. I think the reason my robot struggles against Walls is that Walls is always moving, and my robot struggles with moving targets. The linear targeting strategy I implemented improved my results against Walls but actually performed worse against the simpler robots, so I eventually ditched it altogether.

In order to test my robot, I implemented six tests: two acceptance tests, one unit test, and three behavioral tests. The acceptance tests were used to verify that my robot could reliably beat some sample robots. This made the development process faster and easier since I could quickly test changes to my robot's behavior to see if they improved its performance against other robots. The unit test verified the output of one of the methods that my robot calls when it fires a bullet to ensure that it uses proportional power based on enemy distance.

The behavioral tests made sure that my robot behaved like I intended. For example, one behavioral test tested that the robot had moved when it detected an energy drop. Another tested the bullet power to verify that proportional power was being used in firing, and the last one made sure that my robot did not continually run into walls.

Overall, developing this competitive robot has really taught me a lot. I improved my skills in Java and Ant, became comfortable with build systems, wrote some tests for the first time, and became more familiar with the Eclipse IDE, among other things. This was a valuable learning experience, and I can't wait to see what implementations my classmates have come up with for the Robocode competition.

Thursday, September 29, 2011

Becoming Familiar with Build Systems: Apache Ant

This week, I began my first foray into using build systems with Apache Ant, and I must say it was a bit confusing and frustrating at times. The task seemed simple enough at the outset: create a few build scripts to compile and run a sort of "Hello World" java program, as well as a few other things.

I would guess that no more than two minutes passed before I encountered my first error. But this being my first ever experience with Apache Ant or build systems in general, it was to be expected. At one point I remember thinking to myself, "I thought this was supposed to make life easier?" as yet another "BUILD FAILED" message stared smugly back at me. But if there's one thing I've learned from my experiences with trying new things related to programming, it's that there is always a learning curve, and that you have to struggle through it the first few times in order to get better.

All joking aside, eventually I was able to complete the assignment after some headaches. However, I think these katas (practice exercises) taught me a few valuable things. The first, and most obvious, thing I learned was the syntax required for Apache Ant. Personally I feel it's more important to get a general feel for how things work than to memorize the specific syntax, and I think these katas let me do that as well.

Another thing I learned was that working with filepaths can be a little confusing at times, because there are two kinds: relative and absolute. It's pretty easy to type the wrong filepath and then have to deal with some cryptic errors because of it.

Overall, these katas helped me visualize how large scale projects need build systems to satisfy Prime Directive #2, which checks that a system can be installed by users or other programmers. I think this will come in handy later on when I start writing some bigger programs.

Monday, September 19, 2011

Getting Familiar with Robocode

If you haven't already heard of Robocode, it's an open-source game where the goal is to develop a robot battle tank to battle against other tanks written in Java or .NET. The battles happen in real-time and are generally fast-paced and pretty fun to watch.

The game is meant to be educational and fun, and it is--but that doesn't mean things don't get more advanced. Those who participate in Robocode tournaments often use advanced algorithms and techniques such as tracking their opponents' moves. Yet at the same time, a beginning programmer can create a simple robot in a few minutes.

To get our Software Engineering class started with Robocode, Professor Johnson had us create a few basic Robots as sort of training methods or "katas" that would allow us to become familiar with the basic movements or actions that a robot can do. While I got a late start and only finished 8 of the basic 13 robots, I still think I might be able to get it together in time for our class Robocode tournament next week.

The robots I did not implement are Position05, Position06, Boom03, Boom04 and Follow03. I found that they were a bit harder than the other robots and because I got a late start I just didn't finish them. I still think I learned something about the robot behaviors such as how to move, locate other robots, and follow or shoot at them. However, I also thought some things were a bit hard to understand. I was surprised at how the gun, radar, and tank interacted. Also, the concept of "turns" during a battle is still a little confusing to me.

I think in creating a competitive robot I will definitely have to first attempt the rest of the basic robots. I think they would be a good foundation upon which to build more advanced robots. This is the first time I've ever heard of the term "kata", however I definitely think they are useful in software engineering. The way they require you to perfect a basic technique as a means to apply it to more advanced techniques is really helpful and directly applies to programming in a lot of ways.

With that in mind, I think my competitive robot design will be based on constant movement and trying to shoot where I think the other robot will be. Whether or not I have the ability to implement those designs is another story...

Wednesday, August 31, 2011

Chapter 2: The Infamous FizzBuzz

The FizzBuzz program is a problem that is used by interviewers in software development to determine whether an applicant can actually program. The reason it exists and is used in interviews is due to a belief that is held by many software engineering professionals: most so-called programmers can't write code at all.

So, as a software engineering student I guess it's a rite of passage that I provide my version of the FizzBuzz:

In the first implementation, I was simply trying to complete the requirements and see that the output was correct. After I started up Eclipse and created the project files, it took me approximately two minutes and thirty seconds to complete the program and verify the output. Because we had completed the problem in class a few days before, the completed code did not take very long at all.

In the second implementation I altered the code to make it more modular and organized. This practice lends itself to creating bigger projects with more lines of code. From start to finish this implementation took a little longer at about four minutes, but I think it is a more useful implementation than the first.
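Since the post describes the two implementations without showing code, here is a sketch of what the "more modular" second version might look like. The idea is to factor the per-number logic into its own method so it can be tested apart from the printing loop; the exact structure is my reconstruction, not the original submission:

```java
// A modular FizzBuzz: the decision logic lives in valueFor(), and main()
// only handles the loop and the printing.
public class FizzBuzz {

    /** Returns the FizzBuzz string for a single number. */
    public static String valueFor(int n) {
        if (n % 15 == 0) { return "FizzBuzz"; }
        if (n % 3 == 0) { return "Fizz"; }
        if (n % 5 == 0) { return "Buzz"; }
        return String.valueOf(n);
    }

    public static void main(String[] args) {
        for (int i = 1; i <= 100; i++) {
            System.out.println(valueFor(i));
        }
    }
}
```

Separating the logic from the output this way is exactly the kind of organization that pays off as programs grow, since the interesting part can be unit-tested directly.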

This exercise, though short, made me think about software engineering and how time is spent coding. Good code that scales well will save time in the long run, though it takes more time to write initially. Bad code that is quicker to write up front, however, could cost you a lot more time later on.

My hope for the future is that I will make it a habit to strive to write better code that is easier to read and understand, rather than simply find the quickest solution. Though I think I am still learning the basics of programming, I think this approach will pay off in the long run.

Sunday, August 28, 2011

Three Prime Directives: iPod xTract

The Three Prime Directives for open-source software engineering are goals for developers to ensure that their system is useful, easy to install, and easy (for other developers) to understand. Furthermore, only an external user can verify that a system meets the Three Prime Directives. In my quest to learn more about how to achieve these goals I decided to look at an open-source program called iPod xTract.

One of the main gripes I've always had with the iPod over the years is the lack of an easy way to extract songs from an iPod to a computer. I have always assumed the reason for this is to prevent the sharing of songs amongst users. However, I've also been unfortunate enough to have lost my entire music library once due to a failed hard drive, so I recognize the need for users to be able to retrieve their songs as well. iPod xTract claims to accomplish this task by allowing users to search their iPod and save the songs they want to a local computer quickly and easily.

Prime Directive #1: The system successfully accomplishes a useful task.

In short, iPod xTract does exactly what it claims to do. I was able to access my iPod, search for a song, and save it to my computer quickly and easily. It's a simple program that does its job well. The developer mentions that he began the project because he was unsatisfied with similar programs that were slow due to the indexing and displaying of the song library in its entirety. iPod xTract is much quicker because it does not retrieve anything until the user hits the search button. Overall, it accomplishes its task successfully.

Prime Directive #2: An external user can successfully install and use the system.

I downloaded iPod xTract as a .jar file, which I was able to run on Mac OS X, and I believe most external users should be able to download and install the system fairly easily. Once the program started, I was able to select my iPod and quickly search for a song to extract.

The user interface was simple and easy to use. Searching was made easier with parameters such as title, artist, album, year and genre. From there, it was as easy as hitting the 'Extract Song' button to save the song to my local computer.

Songs were downloaded almost instantly to the specified directory, and I was able to play them in both iTunes and Winamp. I found iPod xTract so useful and painless that I will likely use it again in the future and recommend it to others as a useful tool.

Prime Directive #3: An external developer can successfully understand and enhance the system.

The source code for iPod xTract was formatted in a way that was easy to read, and there were a decent number of comments to aid other developers in understanding the code. There were also javadoc comments throughout, so other developers can get an overview of the different methods, classes, and other components that make up the program. One thing I did not see was a basic overview or explanation of how the system works for prospective developers looking to enhance the program. While the current level of documentation might be acceptable, such an overview would make other developers' work much easier; had it been provided, I believe iPod xTract would have completely satisfied Prime Directive #3. I think that if I had time to become familiar with the code, I would be able to at least modify the system or attempt to improve it.