Archives

You are currently viewing the archive for May 2007
Category: General
Posted by: afeldstein
I don't know why CMP Media LLC sends me VARBusiness magazine. I've tried to explain to them that Cosmic Horizon is not a value-added reseller.

Anyway, I glanced at the 2007-05-14 issue and saw that it contained some good advice from David Russell for companies in hiring mode. Let me share a couple of quotations with you:

  • "If I had $20 for every time a [company] told me they thought someone was a good hire but that they didn't work out, I could fund an IRA."
  • "... hire great people rather than hiring convincing interviewers ..."


Does this sound familiar?

I would add another item to that list:

  • Take a close look at your initial screening process and the people who implement that process. So-called "Technical" Recruiters are, with some exceptions, less skilled than they were in years past. I call the worst ones "Keyword Counters". You want them to get the list of candidates down to a manageable size, but don't let them eliminate your best candidate before the telephone interview.


A "Technical" Recruiter once eliminated me because "CPU" does not appear on my resume. My colleagues and I had a good laugh over that one, but I feel sorry for the client.

I am available, by the way.
Category: General
Posted by: afeldstein
On 2007-05-21, IBM launched its first POWER6 server, the IBM System p 570. I am genuinely proud of this achievement, having worked on the POWER6 microprocessor myself. With the innovative test programs I developed for IBM in 2004 and 2005, I made significant pre-silicon bug discoveries, particularly in the core. Customers purchasing the IBM System p 570 will not experience any of those bugs.

In the press release, Bill Zeitler recognized the "relentless innovation" employed in the POWER6 microprocessor design project.

The press release does, however, prompt several questions and comments:

  • "The processor speed of the POWER6 chip is nearly three times faster than the latest HP Itanium processor that runs HP’s server line."

    On what is that claim based?

  • "SPECfp2006 (measuring floating point-calculating throughput required for scientific applications)"

    SPECfp2006 does not measure throughput. It measures speed.

  • "IBM System p570 1-core (4.7 GHz, 1 chip, 2 cores/chip,1 thread/core) SPECfp2006 result of 22.3"

    SPECfp2006 is a peak metric. What was the base result?

  • All SPEC CFP2006 Results Published by SPEC

    As of 2007-05-23, the results are missing. When will they be published?

  • "In cases where an over-temperature condition is detected, the POWER6 chip can reduce the rate of instruction execution to remain within an acceptable, user-defined temperature envelope."

    Does SPEC CFP2006 stress the system long enough for the over-temperature condition to be reached? If not, a real customer workload might, in which case actual IBM System p570 performance would be less than the benchmark results suggest, relative to competitors' systems that cannot dynamically reduce the rate of instruction execution.
Category: General
Posted by: afeldstein
It appears that I am getting very close to the release of FSS Version_0-006, which introduces automated verification to the SPARC simulator. Instead of just one simulation run, this version pulls one test program after another from a test cases database, pushing a result for each into a results database.
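The pull-one-test, push-one-result loop can be sketched in Java, FSS's own language. Everything below is illustrative: the class and method names, and the in-memory stand-ins for the two databases, are my assumptions for this sketch, not FSS's actual API.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

// Hypothetical sketch of an automated-verification loop: pull test
// programs from a test cases database, simulate each one, and push a
// result into a results database. Names are illustrative, not FSS's.
public class AutoVerify {

    // Stand-in for one test cases database entry (a SPARC executable).
    record TestCase(String name, byte[] executable) {}

    // Stand-in for one simulation result.
    record Result(String testName, boolean passed) {}

    // Placeholder "simulation": a real simulator would execute the
    // SPARC binary against the design under verification.
    static boolean simulate(TestCase tc) {
        return tc.executable().length > 0;   // trivial pass criterion
    }

    // The core loop: one result pushed per test case pulled.
    static List<Result> runAll(Queue<TestCase> testCasesDb) {
        List<Result> resultsDb = new ArrayList<>();
        TestCase tc;
        while ((tc = testCasesDb.poll()) != null) {
            resultsDb.add(new Result(tc.name(), simulate(tc)));
        }
        return resultsDb;
    }

    public static void main(String[] args) {
        Queue<TestCase> db = new ArrayDeque<>();
        db.add(new TestCase("rand_0001", new byte[] {0x01}));
        db.add(new TestCase("rand_0002", new byte[] {}));
        for (Result r : runAll(db)) {
            System.out.println(r.testName() + ": " + (r.passed() ? "PASS" : "FAIL"));
        }
    }
}
```

The point of the queue abstraction is that the loop neither knows nor cares where test cases come from, which is exactly what lets the generator live outside the simulator.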

To enable automated verification, the user will need to provide a test cases database. FSS is robust enough to allow interactive operation if a suitable test cases database is not present. Of course, this means that, with this release, I will need to describe what constitutes a "suitable test cases database". I will get around to that, but not in this blog item.

In house, I am randomly generating assembly language programs, then calling the SPARC assembler and linker present in Solaris. However, FSS is a Java application. One of the benefits of using a (properly written) Java application is that you can run it anywhere. Therefore, I cannot assume that you are running Solaris or that FSS would have access to a SPARC assembler and linker in your environment. That is why my in-house random test program generator is not integrated with FSS. Instead, for automated verification to be available to the user, FSS depends on an external test cases database, which contains SPARC executables.
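An out-of-tree generator like the one described above might drive the native assembler and linker through `ProcessBuilder`, along these lines. This is a sketch under assumptions: the tool names, flags, and file names are illustrative, and FSS itself does none of this.

```java
import java.io.IOException;

// Hypothetical sketch: a test-program generator invoking the native
// assembler and linker as external processes. Tool names and flags
// here are illustrative; this is not part of FSS.
public class BuildTestCase {

    // Run one external command and return its exit status.
    static int run(String... cmd) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.inheritIO();                  // show tool output on our console
        return pb.start().waitFor();
    }

    public static void main(String[] args) throws Exception {
        // Assemble, then link, one randomly generated program.
        if (run("as", "-o", "rand_0001.o", "rand_0001.s") != 0
                || run("ld", "-o", "rand_0001", "rand_0001.o") != 0) {
            System.err.println("toolchain failure for rand_0001");
        }
    }
}
```

Keeping the toolchain behind a process boundary like this is what makes it reasonable to exclude the generator from a portable Java application: only the generator, not the simulator, depends on the host platform.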

Let me take you through some of the highlights of the past few months.

By late November, I was successfully storing results of automated verification in the new results database (at least I thought I was at the time). Those results are useless to you unless you can see them, so I decided to create a separate report generation application. Such a postsimulation application is beyond the scope of the SRS, but as I said, you must have it. And it is compatible with Requirement 1.9.11. Therefore, with Version_0-006 I will be delivering an "application suite", one application per Jar file. The bundling will take place at the Feldstein_SPARC-V9_Simulator_Version_0-006.tar.gz level.

On New Year's Day, the postsimulation tool showed me that FSS had failed to add certain data to the results database. I quickly fixed that. I spent a few days making the results look pretty. On 2007-01-06, I wrote in my notes, "Subject to verification, Version_0-006 appears to be ready for release."

Automated verification showed that it was not ready. I gave FSS what I estimated to be an 8-hour workload, only to discover that a memory leak was preventing me from scaling up.

The Java Monitoring and Management Console (JConsole) tool confirmed the existence of the memory leak, but I already knew that. For memory leak debugging, I needed something more powerful. NetBeans Profiler 5.5 helped me to find the root cause. My ego was spared by the fact that the problem was in JHDL, not in FSS. By 2007-01-15, I had stopped the memory leak. Cosmic Horizon JHDL Version_0-3-45-001 fixes the problem.

Cosmic Horizon JHDL Version_0-3-45-001 also extends the JHDL API, allowing you to create a robust SPARC-V9 implementation in JHDL. Furthermore, if you instead create a poor implementation that allows JHDL to leak, FSS will detect this type of memory leak and indicate a failing design under verification without crashing. FSS is a design verification tool, after all.
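One way a verification tool could flag a leaking design under verification without crashing is to watch heap occupancy against a budget, using the standard `java.lang.management` API. The sketch below is my illustration of that idea only; it is not FSS's actual mechanism, and the class name and threshold are assumptions.

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

// Hedged sketch: fail a verification run, rather than crash, when heap
// usage after a suggested GC exceeds a budget. Illustrative only; this
// is not FSS's actual leak-detection mechanism.
public class LeakGuard {

    private final MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
    private final long maxUsedBytes;

    LeakGuard(long maxUsedBytes) {
        this.maxUsedBytes = maxUsedBytes;
    }

    // Returns true if heap usage (after suggesting a GC) stays within
    // the budget; false would signal a failing design under verification.
    boolean heapWithinBudget() {
        memory.gc();   // collect unreachable objects before measuring
        long used = memory.getHeapMemoryUsage().getUsed();
        return used <= maxUsedBytes;
    }

    public static void main(String[] args) {
        LeakGuard guard = new LeakGuard(512L * 1024 * 1024);  // 512 MiB budget
        System.out.println("heap within budget: " + guard.heapWithinBudget());
    }
}
```

The forced collection before measuring matters: without it, ordinary garbage is indistinguishable from retained objects, and the check would report false leaks.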

By 2007-01-18, I determined that an infrequent assertion failure in Sputnik's integer multiplier was far too frequent to allow 8 hours of uninterrupted automated verification. Although the integer multiplier is not fully activated, it is present in Version_0-006. I would rather fix the assertion failure now than disable or remove the multiplier. I tend to like forward progress. I am stubborn that way.

By 2007-01-20, I was asking myself, "Would it not be nice to be able to look at some waveforms in FSS?" If it would be nice for me, then certainly you will want such a feature too. See Requirement 1.8.2.2.

On 2007-01-25, I became a Sun Certified Programmer for the Java 2 Platform, Standard Edition 5.0 (SCJP). Right after that, I switched development of FSS to Java SE 6.

GUI programming is difficult for me (at least it was). By early May, I was seeing waveforms in FSS. These are generated during simulation, by the way.

On 2007-05-08, FSS's waveform view helped me to solve the assertion failure in Sputnik's integer multiplier.

I then ran what I had estimated to be an 8-hour workload. It completed in 42 minutes! Apparently, I did not have enough data for an accurate extrapolation. Last night, I randomly generated more test programs. Today, I will try again. I want 8 hours without "FSS failure" before I release FSS Version_0-006.

There are just a few items on the to-do list before the next release:

  • After completing the waveform feature, I noticed that I had broken automated verification without waveforms. I have fixed that, but I need to go back and make sure that waveforms still work.

  • Examine the results from the 8-hour workload. I would really like to have 8 hours without "Sputnik failure" before release too.

  • Decide how to deliver test cases database documentation. If I decide to deliver it not packaged with the application suite, then I can work on that right after release.