Technical Life - cellophane
the story of an invisible girl
renniekins
Technical Life
In my "introduce yourself" post, I opened the floor to questions. One question was "I'd be most interested to hear more about is your technical life -- you write a lot about the social aspect of your job, but since I'm a geek I'd also be interested in what tools you're using, how you like them, that kind of thing."

I figured some of the other folks on my fl might be also curious, so here's a quick rundown. Feel free to ask more questions if you'd like.

Hmmm, my technical life. I do Java development. I happen to run Windows, but I don't much care for it. I use Eclipse as my IDE, and I love it. I use Agile as my style of project management, and I am a big advocate. My current project uses Scrum, a form of agile. We work in three-week "sprints" (aka iterations), each with a specific set of tasks (stories) that we attempt to finish within the sprint, and we do two in a row. Every two sprints (six weeks) the whole team travels and gets together for a colocated planning session, wherein we spend a week in a room with a whiteboard and plan and design the next six weeks' worth of work.

There are 30-ish people on my project, and I'm the only woman. There are actually three 10-person teams, since agile is best done with a small team. We are located in several US states, Belgium, and Germany. Probably 30% of the team works out of their houses; the others work in offices.

I work at home when not at the colos, sitting at a desk in my den. I do lots of pair-programming (another agile/eXtreme Programming concept), and we use VOIP and VNC to do this. We also use a lot of instant messaging. It works better than I would have expected it to. Most of my projects literally involve two people working on one desktop, both talking/designing/working together, one person "driving" aka typing. This means I'm on a headset talking to one or more coworkers almost all day, even though I'm alone in my house. When I'm not driving, I sometimes have a cat on my lap. We have daily status meetings which last about 10-20 minutes, to keep everybody on the same page and aware of each story's status. A "story" is a unit of work to accomplish.

I have a laptop and a laptop backpack which always holds a spare power cord, a mouse, a headset, and an internet cable. (And a spiral notebook for occasional scribbled notes. Other than that, my job is completely paper-free.) With this setup I can be completely mobile and work anywhere with internet connectivity. Talk about agile! That said, it's easier to work somewhere without distractions, and somewhere with some ergonomics set up.

Whoops, I got non-technical again I think. I am a big proponent of object oriented programming. I also love test driven design/development. I have gotten very good at the classic TDD method of coding, which is: write a test that fails, get the test to pass, refactor, repeat. It's hard to get into that mindset, but when you do it helps immensely with design and test coverage.
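The red-green-refactor loop described above can be sketched in a few lines of Java. This is a minimal illustration with made-up names (PriceCalculator and its test are my invention, not from the project discussed here), using a hand-rolled assertion instead of JUnit so it runs standalone:

```java
// Red-green-refactor sketch: the test below was written first and failed
// (red), then totalWithTax was written to make it pass (green), and any
// cleanup happens afterward with the test as a safety net.

class PriceCalculator {
    // Simplest code that makes the test pass; refactoring comes after green.
    double totalWithTax(double price, double taxRate) {
        return price + price * taxRate;
    }
}

public class PriceCalculatorTest {
    static void assertClose(double expected, double actual) {
        if (Math.abs(expected - actual) > 1e-9)
            throw new AssertionError("expected " + expected + " but was " + actual);
    }

    public static void main(String[] args) {
        PriceCalculator calc = new PriceCalculator();
        assertClose(108.0, calc.totalWithTax(100.0, 0.08)); // this test was red first
        System.out.println("green");
    }
}
```

In practice a framework like JUnit replaces the hand-rolled main, but the cycle is the same: failing test, passing code, refactor, repeat.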

I also *love* to refactor, reducing redundancy and complexity, making the code clean and readable. Eclipse has lots of refactoring tools that make it easy to write code extremely fast, then tidy it up with the tool. I am a keyboard junkie when it comes to coding, and although I can barely remember what I ate for breakfast this morning, I can tell you the key combinations for all kinds of frequently-used refactorings and lookups. My memory is a weird thing. (For example, here are a few off the top of my head. Classes that implement a method? Control-T. Classes that use a method? Control-Shift-G. Open a resource? Control-Shift-R. Alt-Shift-R will rename, and Alt-Shift-M is extract method. Make getters and setters? Alt-S-R. And if you can't remember a refactoring combination, Alt-Shift-T will pop up a selection window. And yet, I still don't remember the names of everybody on my team, and I spaced out while driving somewhere familiar last night and made two wrong turns.....)

Okay. That's probably more than enough geekiness for now. Feel free to ask questions or request elaborations though!

read 16 comments | talk to me!
Comments
From: marsgov Date: March 31st, 2008 01:37 am (UTC)
What you need is a GPS unit with keyshift combinations. "Honk Honk Left-turn flash brights" takes you to the supermarket, for example.
From: renniekins Date: March 31st, 2008 01:41 am (UTC)
hahahahaha!!! I love it!
From: pstscrpt Date: March 31st, 2008 02:51 am (UTC)
I use Eclipse as my IDE and I love it.
I generally used Eclipse for my school projects that used Java and I generally liked it. A comparison of Eclipse with JDeveloper (that I had to use at work) is the biggest thing that makes me suspect that J2EE development in general isn't nearly as bad as the J2EE development with Oracle tools that I was doing.

I work in a .Net shop now, but we use Oracle for the database, and most of us use SQL Developer as the front end (the older guys who've been using Oracle for 20 years use SQL*Plus). Using that helps highlight that Oracle is just plain awful at working with Java.

--------------------------------------------------------------------------
And I'm sold on the general idea of test-driven development. However, I'm still very suspicious of the sort of design that requires 10 interfaces and 20 classes to do jobs that would be two pages of basic procedural code, and that seems to be necessary to support the tests under Java/.Net style languages with current tools. It seems to me that conditional compilation could be a lot simpler (maybe just basic #IFDEF stuff, maybe a heavy-duty macro system, or maybe just having the test system fill in alternate versions of a class).
From: kyril Date: March 31st, 2008 03:19 am (UTC)
We just switched from Sybase to Oracle (don't do it if you have the option, in either direction! If you have to do it, isolate every single bit of your SQL into a separately testable thing, have folks who understand the application test those things, then think about starting the conversion...there's gotchas going back and forth!) and while we nominally have DBA support we didn't have before, in practice Google and I are still the actual application DBAs. And while TOAD has some nice views (I like the session browser), if it doesn't have what you want (e.g. sort sessions by total CPU time or most recent call) you have to google for it and use SQL anyway...

On the other hand, having started 8 years ago on this application as a big OO proponent, I discovered that a lot of nice OO concepts (e.g. double dispatching, inheritance in general) are not useful, or are easy to misuse, much of the time. Of course, other ideas like information hiding/encapsulation are much more broadly applicable and don't need OO to do. Proliferations of classes and actors are bad. Inheritance where it isn't useful is bad. Sometimes the data just needs to be dumb data, or structures, and sometimes a piece of the application needs to be taken out of the OO level (even if you have classes to load and persist the records and handle business behaviors) and treated as a data flow application, done in SQL or awk or some such. On the other hand, if your database has any qualms about parallelizing, and the job is heavy enough to convert to SQL, you may eventually wish you had stayed with the OO code and could run 48 of them simultaneously.

Having used JUnit seriously only on a couple pieces, I can say making code testable does cause certain "funny" design decisions...plus pretty much every private method becomes protected for me as I need to make at least one "testable" subclass that fakes out the parts I'm not trying to use in any given test. It's kind of a pain. I wish I had (or we were willing to add on) a macro language for all of our Java, awk, ESP (batch scheduler) etc. code, if only to make development vs. test vs. production environment customizations easy.

(Why would anyone want to make stuff private when it could be protected? I usually don't get that...unless you're making classes to sell without the source code or something.)
From: pstscrpt Date: March 31st, 2008 03:45 am (UTC)
I expect Sybase to SQL Server would've been a lot less work. You have to be willing to keep Windows servers around, though. And I tend to favor architectures where the database is the heart of the system (ideally to the point of banning business logic that isn't in the database), but one of those means you're never going to change the back-end.

I discovered a lot of nice OO concepts (e.g. double dispatching, inheritance in general)
I've had some tasks where inheritance all over the place seems appropriate and some where I'll extend maybe one class in a project. It mostly seems to depend on how much I end up modeling the problem.

One thing I think I have decided is that if you're using a class, it should be none of your business what it extends. Polymorphism should come from interfaces and code reuse should come from inheritance (or macros). Mixing and matching interfaces and derived classes like .Net and Java do is just a conceptual confusion.
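A quick Java sketch of the "none of your business what it extends" idea: callers type their references as the interface and never see the concrete class. List and ArrayList are real JDK types; countEvens is a made-up example method:

```java
// Callers depend only on the List contract; the concrete class (ArrayList)
// appears in exactly one place, the construction site, and could be swapped
// for any other List implementation without touching countEvens.
import java.util.ArrayList;
import java.util.List;

public class InterfaceTyping {
    // Polymorphism via the interface: works with any List implementation.
    static int countEvens(List<Integer> numbers) {
        int count = 0;
        for (int n : numbers) {
            if (n % 2 == 0) count++;
        }
        return count;
    }

    public static void main(String[] args) {
        List<Integer> nums = new ArrayList<>(); // only the constructor names a class
        nums.add(1); nums.add(2); nums.add(4);
        System.out.println(countEvens(nums)); // prints 2
    }
}
```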

From: kyril Date: April 1st, 2008 12:40 pm (UTC)
I can't imagine ditching Sybase for SQL Server. Certainly not on the large HP-UX boxes our application runs on, but even for a smaller application that was always on a wintel platform I'd really want serious cost justification before I even considered it. Oracle has gotchas Sybase doesn't (or maybe just different gotchas), some things with it are faster...but if you "talk to it wrong" some things are much slower, even if your queries are perfectly indexed.

Part of the art of programming is deciding where to partition the application and put in opaque interfaces, and where boundaries are unnecessary because "it's all one thing". Hard and fast rules are dumb...unless some rules are harder and faster than others.
From: pstscrpt Date: April 1st, 2008 01:09 pm (UTC)
I'd really want serious cost justification before I even considered it.
I figured you were changing because you couldn't find Sybase people, anymore. Maybe not.

if you "talk to it wrong" some things are much slower
That's probably true of any of them, but I have particularly noticed that the Oracle optimizer really doesn't seem to be very bright. I'm mostly comparing Oracle 9i to SQL Server 2000, though, and that might not be entirely fair.

Hard and fast rules are dumb
Hard and fast rules in general are dumb, as every problem is different. Hard and fast rules within one system, though, tell you where to look when you have trouble.
From: kyril Date: April 2nd, 2008 06:41 am (UTC)
The optimization is somewhat different: in Oracle, you want to prepare statements or use stored procedures wherever possible; using BEGIN/END blocks makes parsing slow beyond all sanity, and even "soft parses" (all the literature implies that if it's not a "hard parse" it's not too bad) take 90+% of the time a hard parse takes.

The empty string turns into NULL, and it's not true in Oracle that NULL = NULL, nor NULL != NULL, though NULL IS NULL is true. So any time you might have empty strings or NULL values, you have to add a separate "is NULL" case. (In Sybase, '' = ' ' and '' is not null; null = null, and not (null <> null) is also true.)

Usually it doesn't pick stupid query plans if your statistics are vaguely up to date. Usually. But there's subtle things that make it stupid. And even if you have two tables ordered by indexes on the key you join them on, sometimes it still likes to hash the tables (equivalent of Sybase REFORMATTING if quicker) rather than doing nested loops.

Oracle Index Organized Tables seem very much like Sybase Clustered Indexes, except Oracle proponents say they're totally different. And they say Oracle's "clusters" are a third totally different thing...but they don't really look it, from my perspective.

Sybase and Oracle like to miss an index and convert the indexed columns to your literal's type rather than convert your literal to the type supported by the index. But at least in Sybase you can specify a BINARY literal.

Oracle 10G parallelizes (esp. the select part of multi-table insert/select) better than Sybase 12.5 but maybe not better than the current Sybase 15.

The list goes on...

...and no we didn't run out of Sybase people, we thought we had more Oracle support than what little we got. And a premium was placed on using Company Standard Products, plus Oracle goes on someone else's budget but Sybase was on ours. They won't even let us use SQL Server unless the application is internal and puny.
From: pstscrpt Date: April 2nd, 2008 01:18 pm (UTC)
stored procedures wherever possible
75% of my work since June has been PL/SQL programming, so I've got that part covered. Would the hard parse be where it does the hash comparison to see if it already has that SQL compiled?

it's not true in Oracle that NULL = NULL
SQL Server is descended from Sybase, but they corrected the "Null = Null" thing in SQL Server 7 (the version before 2000), and it returns Null, like it's supposed to. I was just getting started in databases then, and was surprised to learn that, because I didn't know you could successfully compare Null to Null. Anyway, I'm kinda surprised Sybase hasn't changed that, yet.

People have been complaining about Oracle treating empty strings as Null for many years. I hear rumors they're going to fix it, but nothing concrete.


But there's subtle things that make it stupid.
Yup. One thing I've learned is that, even though a WHERE EXISTS may seem like it's saying exactly what I want and ought to be more efficient, it's likely to have 300 times the execution cost of a derived table equivalent. I'm not really sure, though, if the optimizer really does just have more shortcomings, or it's that I had eight years of experience learning what SQL Server likes.


Sybase and Oracle like to miss an index and convert the indexed columns to your literal's type rather than convert your literal to the type supported by the index.
SQL Server does that, too. You can have a VarChar column and do a select "Where MyColumn = 3", and it will work until somebody puts in a value that can't be interpreted as a number.


And a premium was placed on using Company Standard Products
That's how we wound up using Oracle's Java tools at Talk America (after they bought out LDMI, where I used to work). The idea was that we could just call up Oracle for help, no matter where in the stack a problem occurred. I think that ended when Talk America was bought by Cavalier, though.
From: specialagentm Date: March 31st, 2008 04:44 am (UTC)
I'm not sure why TDD would have to be linked to any sort of heavyweight design. The one thing that interfaces (as a general concept -- separate what you want to do from who does it and how they do it) help with in TDD is that you can freely pull out the parts of the app you don't want to test in a unit test and use "mock" objects. Then you test the real code of one piece, and fake out the rest so that the real code doesn't care, and you can just interrogate/instrument that.

TDD is wonderful. It gives you a confidence in your code that really does help in speeding up development. If you pass the tests, at least you know within the scope of what you're testing, nothing broke. It's been very rare that I've seen a test break and found out it was the test that was wrong, not my code. Most of the time, running the JUnit tests and seeing all green gets me revved up to dig in and code some more.

Oh, by the way, your ideas of test systems filling in alternate versions -- some frameworks do support that. They'll write mock objects for you, or provide an easy way to produce those. I haven't found those useful to my work yet, but they're out there.
From: pstscrpt Date: April 1st, 2008 01:12 pm (UTC)
The one thing that interfaces... helps with TDD is that you can freely pull out the parts of the app you don't want to test in a unit test and use "mock" objects.
You say "helps" like there's another way to do it. Is there?
From: renniekins Date: April 1st, 2008 01:20 pm (UTC)
You can subclass something you want to mock, overriding methods. You can subclass the class being tested, overriding methods that call the class you don't want to touch.

Oh and EasyMock is a cool tool for building mocks out of interfaces very rapidly with little code.
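The subclass-and-override trick described above can be sketched like this (UserRepository and Greeter are hypothetical names invented for illustration, not from any real project; EasyMock would generate something equivalent from an interface):

```java
// Hand-rolled mock via subclassing: override the method that would touch
// an external resource, so the test exercises Greeter's real logic without
// ever hitting a real database.

class UserRepository {
    String findName(int id) {
        throw new IllegalStateException("would hit the real database");
    }
}

class Greeter {
    private final UserRepository repo;
    Greeter(UserRepository repo) { this.repo = repo; }
    String greet(int id) { return "Hello, " + repo.findName(id) + "!"; }
}

public class GreeterTest {
    public static void main(String[] args) {
        // The mock: an anonymous subclass overriding just the method we fake.
        UserRepository mockRepo = new UserRepository() {
            @Override String findName(int id) { return "Alice"; }
        };
        Greeter greeter = new Greeter(mockRepo);
        if (!"Hello, Alice!".equals(greeter.greet(7)))
            throw new AssertionError("unexpected greeting");
        System.out.println("green");
    }
}
```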
From: pstscrpt Date: April 1st, 2008 01:36 pm (UTC)
That's the same idea, though, isn't it? And you'd still need to design your system to use factories everywhere to supply the mock versions when appropriate. Unless maybe the constructors looked for a test flag, but then you're having the main class serve as a mock...

I dunno. TDD is probably worth all that when you get the hang of it (don't get me wrong, I totally understand the benefits). It just seems like the accepted best practices are getting to be seriously at odds with the fundamental natures of the languages. Maybe when you declare an object reference it should have to be typed as an interface and not a class. Maybe New() should be a static method on the class that can see what test you're running (annotations?) and doesn't have to return the type you said. Maybe first-class functions could be a cleaner way of providing test functionality than programming to interfaces...
From: specialagentm Date: April 1st, 2008 01:57 pm (UTC)
One definition of subtypes in OO [1] is that you should be able to freely substitute a subtype for any instance of its supertype, and the caller should not know the difference. Different behaviors may result, but the message-passing, flow of control, and correctness is identical.

All that factories or dependency injection does is pull that up one more level and let you make a decision to, application-wide, swap in and out different implementations without having to change every single call to new(). Some very dynamic languages like Lisp or Ruby let you do this directly (I can, at runtime, redefine class Foo to be some other piece of code entirely). Less smart languages like Java need a little help, but you can write a factory class or static factory method pretty quickly, and a global search and replace to change:

Foo bar = new Foo();

to this:

Foo bar = FooFactory.fetchMeAFoo();

is pretty easy too.

So, yes... upfront work is needed. But this doesn't just drive TDD, it drives a lot of other pieces of flexibility that are often useful. For us, use of factories means we can write core code that doesn't care where it's running at, and how it gets to the data it needs -- in some cases, it has local database access. In other cases, it has to fetch the data over a Web Service to someone else who has DB access. In the TDD case, it's using fake in-memory data.

I'm not trying to sell anything, just trying to show the full picture of why these practices are followed. I'll admit that the way you implement these concepts could be more elegant, but the dirty bits are usually all quite well hidden once you've got it all set up.

[1] http://en.wikipedia.org/wiki/Liskov_substitution_principle
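A minimal sketch of the FooFactory idea above, with the test-time swap made explicit (the install() hook and DatabaseFoo are my invention for illustration; real code might use a DI container or a setter instead):

```java
// Production code asks the factory for a Foo; tests install a fake first.
// Every call site of FooFactory.fetchMeAFoo() picks up the swap for free.

interface Foo {
    String fetchData();
}

class DatabaseFoo implements Foo {
    public String fetchData() { return "data from the real database"; }
}

class FooFactory {
    private static Foo instance = new DatabaseFoo();

    static Foo fetchMeAFoo() { return instance; }

    // Test code installs a replacement; application-wide, one line.
    static void install(Foo replacement) { instance = replacement; }
}

public class FooFactoryDemo {
    public static void main(String[] args) {
        FooFactory.install(() -> "fake in-memory data"); // a test double
        Foo foo = FooFactory.fetchMeAFoo();
        System.out.println(foo.fetchData()); // prints "fake in-memory data"
    }
}
```

The static mutable instance keeps the sketch short; a stricter design would reset it after each test.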
From: specialagentm Date: April 1st, 2008 01:40 pm (UTC)
No, I can't imagine TDD without some sort of implementation-hiding, and using interfaces or writing TDD-centric subclasses seems like the simplest route.

If you can't change the run-time behaviors of the code chunk, you need to mock out some other piece of the architecture -- so if you can't isolate a database dependency (and use an in-memory data set or something), maybe what you can do is point to a local database that you can reload with fake data suitable to each test.

To me, interfaces / subclasses aren't overhead -- they fall out of the object design anyway; I already had lots of other reasons to want that sort of flexibility. I try not to go overboard, though -- usually, other aspects of agile (constant refactoring, pairing) help me adjust the object model to hit that sweet spot of "just right".
From: specialagentm Date: March 31st, 2008 04:50 am (UTC)
We've just started using Scrum formally (just started our second Sprint). We're doing 4-week sprints, and I think we're going to plan in one-week "catch a breath" sessions in between, at least for a while (our backlog isn't what I'd call mature yet). I Scrum-mastered the first Sprint, and that was way more taxing than I expected. We all labeled the process as: "intense".

We're not nearly as spread out as your team, but I do have folks in Boulder and Florida to interface with, so we've used a lot of screen sharing and telephony (which happens to be VOIP, but using Cisco IP Phones, so it might as well be POTS). We're finding remote pairing to be not all that big of a hassle -- it's been refreshingly easy.

We wrote a few custom apps to handle our Sprint'ing -- a web-enabled task board with virtual sticky-notes so we can move things around as we're doing the daily Scrum, lots of use of our internal wiki; overall, it's gone very smoothly. Hooray for agility!

We just put a continuous testing system in place; we've been doing continuous builds for a while. Again, I can't even remember how we made do without these. I feel like a real software developer once again :-D

Are you guys doing Sprint demos?