nicknameless Donating Member (1000+ posts), Sun Apr-02-06 05:13 AM
Original message
CA State Sen. Debra Bowen Finally Grills 2 ITAs. (LONG transcript) Part 1
My apologies to the weak of stomach, but the transcript at this point can only be found on BBV.org
There are additional comments by bloggers there, but I'm just posting the transcript.
The third ITA, Ciber (the one that certified Diebold for use in CA), refused to show up.
People from Wyle Labs apparently weren't under oath.

http://www.bbvforums.org/cgi-bin/forums/show.cgi?tpc=1954&post=19356#POST19356

This hearing lasted approximately 3.5 hours, and due to length, we are publishing the material 30 minutes at a time.

Very problematic testimony appears about halfway through the full transcript in response to skilled questioning by Senator Bowen. For those who are involved in the voting machine issue, you'll want to read the whole transcript.

The plodding pace and self-aggrandizing presentations of the ITAs may annoy you in the first hour of testimony. But read carefully. In these early sections, the ITAs paint themselves into corners that prove difficult to get out of later when they are subjected to questioning.

TRANSCRIPT: CALIF. SENATE ELECTIONS COMMITTEE HEARING MAR. 29 2006

First 30 minutes

Senator Debra Bowen: Good morning, thank you for joining us today. This is the third hearing that I've held to hear from voting machine manufacturers about certification processes and issues and it is the third time that Diebold, ES&S, Sequoia and Hart Intercivic have declined to appear. Three of these vendors have asked if they could appear at a later date and we will try to accommodate that request.

The fourth, and I will let you guess as to which one that is, has simply refused to entertain any idea of appearing before the committee so we will begin to look at alternative ways to compel their appearance.

This is the second time that I have invited the three major Independent Testing Authorities, which we refer to as the ITAs, to appear, and I would like to thank the representatives of Systest and – are you "Wyl-ee" or "Wyle", I hear both.

Wyle (Jim Neu): "Wyl-ee".

Senator Debra Bowen: "Wyl-ee." We'll have it right for the record now. I'd like to thank you for volunteering to appear before the committee.

The third ITA, Ciber, has once again declined to appear. I received a letter yesterday from Ciber's CEO noting that although the company has 8,000 employees in 17 countries, there are only a handful of employees that work on voting systems and the company isn't in a position to go from state to state on hearings.

So as with voting machine vendors, the Committee will be looking at ways to compel Ciber's appearance before the Committee since attempting to find a date that works has not been successful. So that is enough about who isn't here.

One other word, this Committee has been looking at voting as an end to end process, starting with the principle that every eligible voter be registered, then moving to the principle that every registered voter have a convenient way to vote, and finally that every voter who has voted have their vote count as it was cast. Much of our focus has been on the third.

We have recently become aware that we have issues in California with the registration of voters and that we have a new system under the Help America Vote Act that has resulted in 43 percent of all new voter registrations in Los Angeles County since January 1st being rejected by the secretary of state's CalVoter database. These voters are in a status that is referred to as a "fatal pend" and unless registrars can contact them personally they will not receive a sample ballot, a permanent absentee ballot or otherwise be informed about an upcoming election, and this includes the April 11 municipal elections and special elections in the 50th Congressional District. So we will shortly be turning our attention to database issues rather than the software and hardware specifically on which the database is run.

But for today we're going to focus on how voting systems are certified. Under the law, before any voting system is certified for use, it must be approved by one of the ITAs. These ITAs are not state or federal agencies, I think this is something that's not clear to many people how this works, they are private companies who are qualified to do this work and who are hired by the vendor of the voting machine systems.

So a number of questions have been raised about how the current system operates and I don't think we have to look much further than what happened with the Diebold TSx to wonder about how this process works.

The secretary of state's test of the Diebold memory cards, which we knew about only after the machines had been recertified for use, identified 16 security flaws that had not been uncovered during the ITA process. Actually, Ciber, which had been hired by Diebold to test these memory cards, had found three security flaws.

That has led many people to be concerned about the testing process. What is possible to discover, what limitations are inherent in any process for testing hardware or software, and ultimately what systems, processes, audits should be in place in order to deal with the inherent limitations of testing of any software or hardware device.

And for anyone who thinks that perfection in either hardware or software is possible I invite you to go to your internet browser's patch page and have a review of the critical security updates and other patches that are a routine part of the world of software, and as to hardware, the electronic waste dumps of this country have become littered, not just with the Osbornes, Commodores, and TRS-100s that many of us in this room had in the late 70s and early 80s but also with devices that had chips that failed, that overheated, that had a variety of other problems and ultimately didn't make the market test.

I do appreciate, I really appreciate the time that Jim Neu and Joe Hazeltine from Wyle Labs and Brian Phillips from Systest have taken. I'd like to start with Wyle, but I do first want to ask you Mr. Phillips, how was that Avalanche game last night?



Senator Debra Bowen: So let's begin with Wyle Labs, I don't know who wants to – do you have a PowerPoint?

Wyle (Jim Neu): No, I don't, I just have a statement I'd like to read.



Wyle (Jim Neu): Good morning. My name is Jim Neu, I'm the senior Vice President and General Manager of the test engineering and research group at Wyle Laboratories. With me is Joseph Hazeltine, senior division director for the aerospace, DoD and commercial test organization at our Eastern test division.

We're pleased to be here and welcome the opportunity to explain our role in testing of voting machines. Wyle Laboratories was founded 57 years ago as the first independent testing lab for systems and components under harsh environments, including dynamic and climatic extremes. This includes simulating vibrations experienced in aircraft, rockets, railroad cars and other vehicles, it includes seismic, earthquake simulation, and an array of climatic conditions such as high and low temperature, humidity, airborne salt, sand and dust.

As critical civil, military and space programs progressed during the 1940s and 50s, the demand for simulations in a laboratory setting increased. Today Wyle is the world's leading environmental simulation laboratory with nearly 3,000 employees. We're engaged in test and evaluation activities across the U.S.

Products tested by Wyle include fuel valves for the space shuttle, most major aircraft components, and many consumer products. Wyle provides testing services to the aircraft, military, space, communications, transportation and power industries. We maintain expertise in critical technical areas to ensure that we can always provide realistic simulation of the environment in which a product will function and that we can take accurate and objective measures on how the product operates in the specified environment.

Wyle became involved in the testing and certification of electronic voting systems in the early 1990s. Wyle was the first company to obtain accreditation by the National Association of State Election Directors, or NASED. It must be noted that Wyle does not certify or approve the voting system for use. Our work is simply to test the product in accordance with the required standards and to document the results. We do not control the listing number of the product, and are not the final authority on the acceptability of the system.

It should also be noted that shortly after our accreditation as an Independent Testing Authority, by NASED, we limited our involvement to the precinct level voting system, that is, the machine that a human would interface with during an election, and the firmware that would be resident in that system.

Wyle does not perform software evaluations at the central count level and does not evaluate election management system software.

The standards we utilize include the 1990 performance and test standards for punch card, mark sense and direct recording electronic voting systems, the 2002 Voting System Standards, and the 2005 Voluntary Voting Systems Guidelines, Volumes I and II. This VVSG is in the grandfathering period and will become applicable in 2007.

Since the early 1990s, Wyle has tested over 140 separate voting systems. The process that we follow in the performance of a certification program is outlined roughly as follows: The first phase, pre-test activities. Initial contact generally from a potential customer is to request a quotation for testing services.

Generally we quote the following, generally the following tests are required. First is a review of the Technical Data Package. Then electrical testing, which we test for, we quote the test for power disturbance, electromagnetic radiation, electrostatic disruption, electromagnetic susceptibility, electrical fast-transient, lightning surge, conducted RF immunity and magnetic fields immunity. Then we quote the environmental testing portion, high temperatures, low temperatures, humidity, vibration, bench handling, temperature and power variations, product safety, functional testing – I'm sorry, other areas then that are quoted are product safety, functional testing, a source code review and final report.

Phase 2 is the qualification testing itself. I'll start with hardware testing, it's typically performed on multiple units concurrently. Preferably one unit will begin humidity testing, which is a 10-day test, and another will begin the remaining non-operating environmental tests, such as low temperature, high temperature, vibration and bench handling.

Prior to and immediately following each test, a functional test is performed on the equipment under test to verify that it's still operating.

For all electrical tests and temperature and power variation tests, the equipment is placed in an active mode in which the ballots are continually processed. During test performance the equipment must remain operating.

Product safety testing is usually performed in parallel with environmental and electrical tests.

During any part of qualification testing, interim reports are issued to the customer informing them of the test status, findings to date and any other information deemed pertinent to the test program.

We then do a Technical Data Package review, the customer's documentation and their data package is reviewed to evaluate the extent to which it conforms to the requirements of their own configuration and their quality assurance practices, and it is additionally reviewed to confirm that it corresponds to the actual configuration and operation of the system that we're testing.

TDP review is performed in parallel with hardware testing and the functional testing.

Next, the functional testing, the voting system is functionally tested to address the following activities: Overall system capabilities, which consist of security – that is, evaluating the access control aspects of the system including the locks, passwords, and pollworker activities. The accuracy, that is, recording the appropriate options for casting votes, recording the election contests as defined by elections officials and recording each vote as indicated by the voter. Error recovery, that is, the ability to recover from a non-catastrophic failure of a device. Integrity, that is to ensure the physical stability and function of the vote recording and the counting process. The system auditability, that is, preparation of activity logs. The election management system, that is generation of ballots, pre and post-election activities. The accessibility which is compliance with disability requirements. Vote tabulating, system status reports and maintaining proper records, the ballot counters, that is operation of the device that counts the ballots, and data retention, that is preservation of the counted votes.
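
[Illustration: the "accuracy" item above is typically exercised by scripting votes and checking the tally. A minimal sketch in Python, using a made-up VotingMachine interface rather than any vendor's real one:

# Hypothetical stand-in for a precinct device; not any vendor's actual API.
class VotingMachine:
    def __init__(self, contests):
        self.contests = contests  # contest name -> list of valid options
        self.tallies = {c: {o: 0 for o in opts} for c, opts in contests.items()}

    def cast(self, contest, choice):
        # record each vote exactly as indicated by the voter
        if choice not in self.contests[contest]:
            raise ValueError("option not on the ballot")
        self.tallies[contest][choice] += 1

def test_accuracy():
    machine = VotingMachine({"Measure A": ["Yes", "No"]})
    for choice in ["Yes", "Yes", "No"]:  # the scripted "voters"
        machine.cast("Measure A", choice)
    assert machine.tallies["Measure A"] == {"Yes": 2, "No": 1}

test_accuracy()  # a wrong tally would raise an AssertionError
]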

We then move to pre-voting functions. A majority of these tests are done by the software ITA and therefore not conducted by Wyle.

Then we move to the voting functions themselves. Each supported election type is verified. Opening polls, casting ballots, activating ballots and augmenting the election counter. Post voting functions, closing the polls, consolidating the vote data, and producing reports.

Finally the functional testing is done on systems maintenance and on transportation and storage.

Functional testing is performed as the last step of the qualification testing. During functional testing, problems may be identified that need correcting before testing can resume. In that case, changes must be made by our customer and then regression testing is performed by us if needed. The functional qualification test matrix which is typically Attachment A in the final report, is also completed during this time. The customer is as I said normally on site during functional testing and again, just to reiterate, software and system end to end testing is performed by others.

The final phase then is qualification report issuance and post-test activities. We prepare a preliminary test report documenting the tests performed, the results, any anomalies encountered during testing, and the resolution of the anomalies. We issue preliminary test reports to the EAC committee and to the software ITA for review. Upon approval by the EAC of the preliminary report, we then issue a final report, and then we gather and archive all the raw data.

I've been asked to speak about client relationships. Let me say first, Wyle has a varied client base, for instance we provide testing services for the original manufacturers of various components to ensure that their items meet the requirements that their customers demand, or a client might be the buyer of a product who wants to ensure that specifications, that his specifications have been met by the manufacturer, or the client may be one of the many government agencies who exist to ensure compliance with regulations at all levels.

Although most testing is done in response to some type of regulation, many tests are performed solely to support a manufacturer's reliability improvement goals, or simply to reduce the risk of product failures and associated cost. I would say particularly in space-related applications, we do a great deal of testing for the manufacturer just so he can be certain that that satellite will survive many years in space.

It is important to note that while Wyle often participates in the development of testing specifications and standards with the government agencies or industry committees, Wyle is not in the business of developing test requirements for its customers. Our function is to test, analyze and report the factual results of our testing as a trusted agent to our clients.

Wyle takes great pride in the fact that its reports are universally accepted as sound, factual, reliable and unbiased. Wyle makes it a priority to avoid conflict of interest in its activities. We have no business or financial interest in any product that we test. As an example, many start-up companies have approached us to request that we take a financial interest in their business in exchange for testing services, but we decline those matters, offers, as a matter of policy. Wyle employees are also required to meet strict ethics and conflict of interest rules as a condition of their employment.

The press has stated that Wyle works in secrecy, suggesting some sort of a clandestine or inappropriate operation. The fact is that Wyle is bound by policy and ethics to respect the privacy of its clients. We view the relationship between an independent testing laboratory and its clients as similar to that between a lawyer and client, or between doctor and patient. As a matter of policy, Wyle does not discuss any client by name, nor will we release any test information or data without written consent from the client. I need to underscore that this is not a practice simply for voting machines, but for every product that we test across the entire spectrum. Simply stated, the test data belong to the client, and they are not ours to share with third parties. This is standard practice for all of the independent testing laboratories in the United States, which provide a critical infrastructure for our healthcare system, manufacturing sector, and the economy in general.

It's not uncommon however, for test results from an independent source like ours to be made available to the ultimate buyer of the product. For example, in nearly all cases of environmental qualification testing for the military, the end customer requires evidence of qualification in the form of a report which the equipment developer must provide along with his or her product.

I've been asked to comment on our relationship with NASED and the EAC. In the case of voting machines, we are an independent testing authority, operating under the auspices of the National Association of State Election Directors and the Election Assistance Commission, which sets the testing standards and requirements under which Wyle operates. Wyle's role is to provide testing and evaluation services as set forth in the voting system standards.

NASED and EAC's role is to review our work, and the work of the software evaluators and then to approve or certify the system when it meets all requirements. Wyle is not involved in that part of the process.

Wyle Laboratories and its management and employees work diligently to provide a vital service to government and industry by providing the best possible range and quality of testing services including protection of the rights to data that belong to our clients. Wyle also understands the importance of voting machines to our elections process, which must always operate with transparency to foster confidence in our results.

I want to thank you for allowing us to present this testimony and we'll do our best, at the right time here to respond to questions within limits of protecting our clients' rights to confidentiality.

Senator Debra Bowen: All right, thank you. Let me just follow up with a few questions that are only intended to help people listening understand some of the terminology.

Wyle (Jim Neu): Yes.

Senator Debra Bowen: All of this, when we get into these technical areas, one of the biggest challenges that we have is to keep people's eyes from glazing over...

Wyle (Jim Neu): I understand, we always, very quickly, devolve into technical discussions when we wind up talking about the things that we do.

Senator Debra Bowen: Right. So what we'll try to do is to just get some of the basics a little further fleshed out. For example, as you were talking about functional testing. I think the easiest part of the testing that you've described for people to understand is the electrical testing and the environmental testing, just because we all know that when we have an electrical device, anything that has a power cord in all likelihood it's gone to Underwriters Labs, and there's been some kind of test that reduces the likelihood not to absolute zero, but pretty close to it, that we'll be electrocuted just because we plug something in. So electrical testing, even though we don't understand how it works, we all understand the concept. Similarly environmental testing, we all interact with humidity, heat, high and low temperatures, we know that we personally function better or less well in certain environmental conditions.

But when you talk about the functional testing you make a reference to system maintenance and transportation and storage, and I have no idea what testing would look like in those areas. What does it mean to do functional testing in the categories of system maintenance and transportation and storage?

Wyle (Joe Hazeltine): Well, system maintenance would involve those designs of a voting system that allow the pollworkers to set the machine up or to do some type of maintenance, an example of that might be a touchscreen calibration so that when you press the button for the voter on the top left corner it actually would record that vote, the calibrations of the touch-screen.

Wyle (Jim Neu): Move the microphone a little bit closer.

Senator Debra Bowen: That's what we're just checking, we're doing a "microphone calibration."

Wyle (Joe Hazeltine): That would be an example of a system maintenance type of a check that could be done, these are primarily activities which would be done by the pollworkers pre and post-election.

Senator Debra Bowen: And so what does the system, you've got a list of voting functions, opening the polls, casting the ballot, post-voting functions, and so I'm trying to understand how system maintenance, calibration I understand, but what other parts that fall under system maintenance are different from what you've laid out in voting functions and post-voting functions?

Wyle (Joe Hazeltine): What I'd like to do is refer to the standard if we can, it's going to give us some guidance on that. Physically on page 2-48 of the Voting System Standards Volume I Performance and Standards, and it says "Maintenance, Transportation and Storage," I'll just read what it says, and the interpretation on that is something that we have to work with the vendor to make sure that we're complying, but "All systems shall be designed and manufactured so as to facilitate preventive and corrective maintenance, while conforming to the hardware standards described in Section 3," which, you know, immediately follow, and there's a number of pages there, "All vote casting and tallying equipment designed for storage between elections shall function without degradation of capabilities after transit to and from the place of use as demonstrated by meeting performance standards," once again in Section 3, "and function without degradation of capabilities after storage between elections."

So what that section of the work that we're doing refers to is moving it from its storage location to the voting precinct, you know, setting the machine up, making sure that it works correctly and then transporting it back to its location after the election.

Senator Debra Bowen: But it's my understanding that ITAs don't focus on usability issues. Perhaps I should ask the question more broadly and let you answer it. One of the questions has been, are usability issues addressed in the testing or is that up to the procuring government to address?



Wyle (Joe Hazeltine): To the extent that they're in the standard they're addressed, but we do, an example would be anywhere where it says "don't do this," say, in the instruction manual we do that. Actually going through the manual that comes with the machine, for setting it up or for having it available during an election, and every time it says "don't do this" that's the first thing that we do. Just to find out what happens if you violate the protocol I guess for setting the machine. And then we document what happens.

Senator Debra Bowen: I'm certain that that's very interesting to see.

Wyle (Jim Neu): I think we probably want to add that the standards, while they're in this document, together, the ITAs and the NASED have developed considerably more robust checklists that are much more specific to what we actually look at. And so when, in whatever phase of testing is being done there has been an agreed to checklist between the NASED and the ITAs that allows us to be certain that we are meeting the intent of that standard.

Senator Debra Bowen: That's good, this leads me into a line of questioning that I wanted to pursue, which is how the NASED standards interact with what you do. What is the lab's relationship to the FEC/EAC published standards? Are they the minimum? Is it possible, is it conceivable that a system could pass testing that violates or does not meet EAC standards or is this an absolute minimum?

Wyle (Joe Hazeltine): We would consider that to be the minimum.

Our relationship to the documents, the 1990 document was prepared before NASED went out and approved Independent Testing Authorities, of which Wyle was one of those approved. On the 2002 voting standards which, once again, you know, NASED and the Federal Election Commission were involved, and we were represented on that committee and worked with them, there were a number of areas, of electrical in particular that we were concerned about, things like cell phones, electrostatic discharge, lightning, things which were added into the standard. On the 2005 version what's happened is that the Election Assistance Commission is using the National Institute of Standards and Technology, or NIST, to actually generate that document, we've been at the hearings, we've listened to what they're doing, we're not a voting member but we're quite aware of where things are headed, and they're doing a much improved job in the area of accessibility, some of the issues, human factors, concerns that you've addressed earlier, and certainly that's an area of the standards that needed to be looked at.

Senator Debra Bowen: But what's the relationship of the checklist that you refer to?

Wyle (Joe Hazeltine): The checklist, initially when 1990 came out there was no checklist and in performing the first evaluation we realized that we needed to go line by line through the standard to make sure that we were meeting all the requirements so we generated a checklist and we started using it on the very first project. When the 2002 version came out we once again worked with NASED and this time it was done with all of the ITAs and NASED together agreeing on this 33-page checklist with, I don't know, a thousand whatever items that are actually being checked, giving us guidance to make sure that anybody that evaluates the system evaluates it the same, and based on that you know I would say that that does set the minimum standard for what acceptability would be.

And please be aware that if we get into our process, and we, it's much like doing troubleshooting and fact-finding, if you find something that doesn't work the way it's supposed to you know we're going to pull on that string as much as we need to to find out where it leads. In other words we may set a higher requirement based on the failure that we actually ran into. In that regard the standards would certainly be the floor of the building vs. the ceiling.

Senator Debra Bowen: So every system is tested against that checklist?

Wyle (Joe Hazeltine): The checklist guides us to make sure that we catch everything that's in the document, documents.

Wyle (Jim Neu): If I may add, the only times that we may not test against something within the checklist is if it's functionality that the vendor states is not being supported. An example would be the ballot rotation algorithms. If that vendor only supports one ballot rotation but there are requirements in the standards for how they support the others we won't test to that because they don't support it, but if they state they support it we run tests against all of the requirements associated with that.

Senator Debra Bowen: Okay. And to what extent do you, is it possible…Well let me--an example is probably clearer. One of the things we saw with the memory cards was when the memory cards were sent back for testing, the questions that were asked were limited. So you didn't have a complete test against all the standards at that point, you basically had the memory cards sent back for a look at particular issues.

And that's what leads this line of questioning. How do we know as members of the public what exactly is being looked at, or is that something that would only happen with software and you wouldn't see that with hardware and firmware?

Wyle (Joe Hazeltine): That memory card was not tested by us, that was the software evaluation, so I really can't speak to the trail there. As to what the public can rely on, you know, there are documents that guide us and these are the guidance documents that we use.

Senator Debra Bowen: One of the things that struck me about this is that the hardware and firmware are separated in testing from the tests of the software. One of our early experiences in California with a massive IT system failure was an attempt in 1995 to redo our Department of Motor Vehicles database. The result was the hardware vendor pointing fingers at the software vendor, the software vendor asserting that it was the hardware vendor who was at fault and somewhere in there I think most people concluded that the problem was that the software in question wouldn't run on the hardware in question. How would that kind of problem come to light, or would it, in a system in which software is tested separately from hardware and firmware?

Maybe I should start by asking you to define what "firmware" means in this context.

Wyle (Joe Hazeltine): Well firmware is machine resident software that's burned into the chip. It doesn't change, it's the operating system if you will that the voting system works to. The pointing fingers, I was in the Navy, and any time something failed it was either the pump or the motor. So the electricians always said "Well it's the pump that's bad" and the mechanics always said "It's the motor" and I was always the referee. In this situation I think it's somewhat easier because the software supplier and the hardware supplier are the same person, the same company. It's not normally coming from two different places, so you're not going to get as much of that finger pointing.

And again, Wyle limited our involvement in voting machines to the machine resident firmware shortly after we were accredited. As we saw the very first election management system and central count system come to us and we looked at the software requirements, we realized that our capabilities were not there and that this would not be serving the public well to get involved in something that we didn't know well. So we went back to NASED formally and said we feel that this is an area of risk and that we want to limit our involvement, and worked with them to have vendors that are able to do that.

Wyle (Jim Neu): And I believe in our case, when we have completed the hardware testing there is still an end-to-end test which is done by the software ITA. I think in Mr. Phillips' case, his organization does all the testing.

Senator Debra Bowen: Well I think what we're going to do is, we didn't give you a chance to do your presentation and we sort of started in questions, but I think the questions are really going to be for both Wyle and Systest. So it's probably best if we back up, hear from you more generally, and then we'll go back to the details.

Systest (Brian Phillips): I have a PowerPoint...

-End of first 30 minutes-

PERMISSION TO REPRINT GRANTED. THIS IS A PUBLIC RECORD, BUT THE TRANSCRIPTION WORK WAS DONE BY BLACK BOX VOTING SINCE THE FORMAL VERSION IS NOT YET AVAILABLE. LINK TO BLACK BOX VOTING http://www.blackboxvoting.org APPRECIATED IF YOU DISTRIBUTE THIS EARLY VERSION.


Bev Harris
Board Administrator

Posted on Friday, March 31, 2006 - 04:02 pm:

quote:

just because we all know that when we have an electrical device, anything that has a power cord in all likelihood it's gone to Underwriters Labs, and there's been some kind of test that reduces the likelihood not to absolute zero, but pretty close to it, that we'll be electrocuted just because we plug something in.




Wyle's electrical and safety testing

(Photos taken of Diebold TSx machines tested by Wyle. One hundred percent of those we examined had a faulty electrical plug, exposing users to live 100 volt bare metal.)

Perhaps Black Box Voting should send Diebold an invoice for our testing services?


Bev Harris
Board Administrator

Posted on Friday, March 31, 2006 - 10:36 pm:
Minutes 30-60:

Systest (Brian Phillips): – Systest Labs, we are a small company, started in 1996, I'm the original founder of the company, I should start, my name is Brian Phillips and I'm the President of Systest Labs.

Our sole business function as a company is to provide test engineering, quality assurance and certification services -- certification services are primarily with the voting systems. We offer voting system software and hardware qualification testing.

We were accredited in 2001 to be a software Independent Testing Authority, that would be similar to Ciber, and in 2004 we were accredited as a hardware Independent Testing Authority, which is similar to Wyle.

In addition to our voting systems testing we also provide independent verification and validation. I like the expression that someone once used, it's software testing on steroids. And then we also provide outsourced software test engineering and services.

The IV&V work that we do is primarily in the public sector arena – we do IV&V for state agencies across the country, California, Colorado and so forth.

The outsourced software testing services are focused on commercial industries, telecommunications, banking, finance, anyplace where there's an IT organization developing software that needs extra expertise and support in doing their testing.

We are an official woman-owned company, certified in the state of Colorado and by the Small Business Administration, because we are a small company of about 70 employees.

Our ITA process, the work that we do for the voting systems industry is governed by the standards that are published by the EAC and NASED, as well as our own processes, methodologies, templates and procedures. Currently we are doing all of our voting systems testing to the April 2002 VSS, Voting System Standards, and once we are required to start testing to the VVSG, the 2005 Voluntary Voting System Guidelines, we will begin testing to that.

In the interim we are evaluating the vendors' products and giving them a heads up whether or not their product actually would meet the 2005. As I think, as Wyle said there are products that will meet 2002 but may not meet 2005 and we're giving them a heads up so that they will have at least some information knowing where they need to go to make their product changes in order to meet the 2005 standards.

Senator Debra Bowen: And just to be clear, even though this is 2006, we're not using the 2005 standards right now.

Systest (Brian Phillips): Right. They're not mandatory until 2007.

Senator Debra Bowen: So, to my knowledge no state or jurisdiction in the country is currently using the 2005 standards. We're all doing our voting systems using -- It's very confusing out there in real person-land to have a 2005 standard that is not being used in 2006.

Systest (Brian Phillips): Right. It's essentially a way to identify them. We are testing to 2002 currently and again we look at how they meet the 2005 just as an added value to the vendor so they're prepared when the 2005s are mandatory.

In addition to that we have our own methodologies of performing software test engineering. Our Advanced Test Operation Management, ATOM, we have quality systems manuals that guide all the work we do, a detailed set of standard lab procedures, SLPs we call them. Many many documents, many many pages long, of exactly how we perform our testing for the voting world.

And that's the information that is actually looked at as part of our audit process with NASED and the EAC. In addition we have review and report forms, databases, templates, checklists, as Wyle talked about earlier, and software and hardware testing tools.

The voting system standards, I put this together but I think the folks at Wyle talked about them very well. First developed in 1990, as we all know, as a voluntary program. And then the new standards, what we refer to as the 2002 VSS, took about three years to put together and, even though officially referred to as 2002, were officially released in 2003 for us to start working with. The technological changes, advances have forced more changes to the standards, and the EAC charter that came in with HAVA was to include accessibility and language components, additions to the standards.

A lot of the work that's been done by the EAC, NIST that was mentioned, the National Institute of Standards and Technology, and support from organizations like IEEE and other standards groups, came up with an approved product, the voluntary system guidelines, and those were adopted by the EAC in December 2005.

I put this in here, I'm not going to walk you through this, but we do have a parallel, this is our process for doing the ITA that we can put onto one page. There's a lot of work that goes on in an ITA process. This is the actual process for doing what we call a full qualification test effort, that's both software and hardware. I'll walk you through that.

Our ITA process – by the way, we refer to it as ITA process, it's qualification testing. The vendor receives the checklist from Systest Labs. Now this is the checklist that was referred to by Wyle, 33 pages, that was developed by the three ITAs together to identify all of the requirements in the VSS that a vendor must meet.

Now if they state they meet all functionality then they have to meet all of those requirements, we will test to show that they meet all those requirements. They actually are supposed to complete this checklist to give us an indication of all of the items that their system will meet. They also submit a Technical Data Package, which includes manuals like user guides, administrative guides, training materials and so forth, requirements and functional specifications documents, their test plans, their test cases, and their test results that they've worked with and their source code.

In addition they are to submit their quality assurance plan that they've used internally within their organization and their own configuration management plan. As part of our work, we evaluate whether or not they've got a reasonable QA plan and CM plan, configuration management plan, and are they actually following those, do they actually work with those.

We review the TDP for completeness. Do they have all elements required of a technical data package, and we create a qualification test plan. After the test plan's completed, we do two, we call them audits, it's just some terminology that came out early on. A physical configuration audit and a functional configuration audit, and they are done in parallel. In the physical configuration audit we look at all the documentation, we review every line of source code, we look at the physical configurations of the equipment that needs to be assembled, put together, we review the vendor's test results, and those are all reviewed to the requirements of the VSS. In the functional configuration audit, the hardware's environmentally tested, if we're actually involved in doing the hardware ITA effort, and the software's tested to verify the logic, the results of the tests, and accuracy to the requirements of the VSS.

All discrepancies are reported and if they impact the ability of the system to be qualified they must be fixed by the vendor and regression tested by our company.

Afterwards we issue a qualification report, which is then submitted to NASED, now the EAC, their technical committee, they review the report, they can question or ask for clarification and once the report is actually approved a final report is issued whereby the EAC issues a qualification number for a specific product. These are some of the details of the physical configuration audit.

As I mentioned before we look at all the documentation included in the technical data package. We have a form that we complete, looking at each document, and that form is based on the requirements within the VSS. As I mentioned that includes the system functionality descriptions, hardware and software design specifications, security documentation, training materials, maintenance guides, user manuals, their configuration management and quality assurance programs. Again all discrepancies are documented and sent to the vendor to be fixed.

We then have verification of the software and hardware configurations. We examine the setup of the system hardware and software components as documented within their TDP. We verify that the documentation is consistent with the configuration used for the hardware environmental testing, and all discrepancies are documented and sent to the vendor to be fixed.

Senator Debra Bowen: So, they're sent to be fixed. What kind of review is there to determine whether the fixes are fixed?

Systest (Brian Phillips): Every fix that is sent back to the vendor gets resubmitted as a part of an update to the technical data package. So if it's documentation changes they resubmit the entire document and that is reviewed to ensure that the changes meet the requirements and that the fix is actually what's necessary to qualify the system. If it's a change in functionality in the source code or a change in a hardware component the product is retested, completely retested; if it's a hardware component they'll rerun the test to ensure that it meets whatever the requirements are for the hardware – and you'll have to forgive me, I'm glad the folks from Wyle are here, because I'm more of a software person, I have other people in the company, Systest Labs, who do the hardware – but if the software has a bug or defect of some sort they will fix that bug or defect in their software and they may fix others as well because we may find more than one. Then they resubmit that software to Systest Labs, then we re-review the source code and we run the software through the same sorts of tests that we were running when we encountered the problems and then we also run it through tests in and around that area to make sure that the fix doesn't adversely affect other parts of the software, that's regression testing.
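
[Illustration: the regression testing just described means rerunning the test that exposed the defect plus the tests around it, not the failing test alone. A minimal sketch in Python with made-up test names:

# After a vendor fix, select the failing test and its neighbors for rerun.
def pick_regression_tests(all_tests, failed, related):
    selected = set(failed) | set(related)
    return [t for t in all_tests if t in selected]

all_tests = ["test_open_polls", "test_ballot_rotation",
             "test_ballot_layout", "test_close_polls"]
failed = ["test_ballot_rotation"]   # the test that caught the original bug
related = ["test_ballot_layout"]    # functionality "in and around that area"
print(pick_regression_tests(all_tests, failed, related))
# -> ['test_ballot_rotation', 'test_ballot_layout']
]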

Going on to our source code review process. And this is source code for software, which is the election management system, and the central count, as well as software resident on firmware devices, if we get involved in that. We will identify what code needs to be reviewed. In some cases, we're actually now seeing companies coming to Systest Labs with 2002 qualified systems who are now making updates, adding new functionality, because certain states are asking for certain features that they don't currently have, so we will look at what's changed on the system versus what was there when it was originally qualified to 2002. More often than not, we're looking at new systems to come through to be fully qualified.

Part of our source code review process is to review the vendor's published coding standards. And we customize our review criteria if it's necessary for the specific programming languages they are using. And then it's an iterative process again, the source code review.

We manually review every single line of source code. In some cases that's 1.5 million lines of code. We look at every single line of code. I've got 15-20 source code review engineers within my company that can work on that. We look for any potential for external modifications of the code, we follow the functions looking for worms, viruses, trojan horses, we check for buffer overflows that could lead to security compromise and the execution of any malicious code. We look for absolute logical correctness, modularity, overall construction and conformance to VSS requirements and vendor coding standards. Primarily that's to ensure that the software's at least maintainable by somebody else if they need to take over.

Senator Debra Bowen: Yes, that's a problem we're very familiar with in California. One of the biggest problems we've had with legacy systems is that there's no documentation and so when you continue to use something for 20 years that you intended to have a six year life there aren't a lot of programmers around who still know anything about that, you don't have any documentation.

Systest (Brian Phillips): You remember back to Y2K, the scramble to find good competent COBOL programmers at that point in time… We also review the software architecture, the overall architecture of the software to look for any use of systems that detect intrusion.

We also have some tools we run against the source code, automated source code review tools, and these help us obtain certain metrics on logic control and constructs. And we'll also look for certain constructs that are known to be dangerous. The tools have the intelligence built in to pick those out.
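
[Illustration: an automated scan for "constructs that are known to be dangerous" can be as simple as pattern-matching C source for unbounded copies. Real tools are far more sophisticated; this Python sketch only shows the idea, with a few common examples:

import re

# A few C constructs commonly flagged as dangerous; real tools know many more.
DANGEROUS = {
    r"\bgets\s*\(": "gets() cannot bound its input (buffer overflow risk)",
    r"\bstrcpy\s*\(": "strcpy() copies with no length limit",
    r"\bsprintf\s*\(": "sprintf() can overflow its destination buffer",
}

def scan_source(path):
    findings = []
    with open(path, encoding="utf-8", errors="replace") as f:
        for lineno, line in enumerate(f, start=1):
            for pattern, why in DANGEROUS.items():
                if re.search(pattern, line):
                    findings.append((lineno, why))
    return findings
]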

But I want to emphasize that we don't rely on the tools solely. We manually review every line of code. I keep getting feedback from academics and so forth, they just don't find that possible, how can you do that? We do it day in and day out.

Senator Debra Bowen: I think the question is not whether it's possible to review it, but whether it's possible to find everything that you need to find in the process of the review, and let me give you a couple examples and let me see what your response is.

Some years ago there was a scam run in the electronic slot machines that involved a programmer and a confederate. The programmer wrote into the source code, somewhere, not all in the same – not neatly set out –

Systest (Brian Phillips): Wasn't commented, this is my --

Senator Debra Bowen: Right, "and here's my hack" – no, it was scattered throughout the code so that you wouldn't necessarily know what was happening, but essentially what it did was if a player entered a particular series of bets in an order that was highly unusual in a row, the odds for payout changed. Now, nobody would test to see "if the bettor bets 2, 7, 3, 8, 4, 9, 5, 11, does the system work?" So there's an inherent limitation, and of course that passed source code review under the Nevada gaming regulations because it was embedded in the rest of the source code and then compiled in such a way that it was undiscoverable. However it did cost the casino in question $1.3 million and was caught only because the two people in question were pigs and they stole too much too fast, it was picked up on audit.

But what that really highlights is a question about inherent limitations in a source code review that has 1.5 million lines of code and whether it is conceivable for any human being or set of human beings to find all of the possible either errors or malfeasance.
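
[Illustration: why a trigger like that survives both source review and functional testing. Nothing below is labeled "cheat", and no routine test would ever place that exact sequence of bets. This is hypothetical Python, condensed into one place where the real scam scattered the logic throughout the code:

TRIGGER = [2, 7, 3, 8, 4, 9, 5, 11]   # the "highly unusual" series of bets

class PayoutEngine:
    def __init__(self):
        self.recent = []
        self.odds = 0.05               # normal payout probability

    def place_bet(self, amount):
        # keep only the most recent len(TRIGGER) bets
        self.recent = (self.recent + [amount])[-len(TRIGGER):]
        if self.recent == TRIGGER:     # fires only for the confederate
            self.odds = 0.95           # rigged odds
        return self.odds
]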

Systest (Brian Phillips): I mean, and those are valid concerns, as you said, it's a lot of source code to look at. They are broken out into logical chunks within the functions, and for instance the 1.5 million lines of code is really made up of five or six main software applications. But those are some of the risks for doing that.

And if there are other and better ways, tools and so forth available, we will make use of them. And we've been looking at source code review tools from all the manufacturers that are out there, from McCabe's Metrics on down to some of the open source, to see how, if any of these can be used to our advantage, to really look more closely at the source code, follow things a little more stringently, perhaps pick these things out but it's like trying to, it's like working for Norton virus protection, you're always trying to stay one step ahead of certain things and sometimes the technology, that particular type of technology doesn't keep up as well.

So there are limitations for that, there are absolutely limitations. But what we do is look at every line of code and at least make sure it meets every requirement within the VSS.

And that is a limitation to some extent of what we're doing versus what I used to do in my IV&V days with the Department of Defense. We were chartered with finding all bugs, all defects period. Zero bugs, zero defects for every staged spacecraft software. We're not chartered with that here, we're chartered to show that the electronic voting system is a viable, qualified system that meets the requirements within the VSS.

Is it completely bug free? My test engineers would love to have that charter, but we don't. We don’t have that. We have to show that an electronic voting system can be used in a way that is outlined by the standards.

So there's a bit of a, as a colleague once put it, we can't really determine how big the fire is, but we can hold them to the fire. So once that's established, we can hold their feet to the fire, once it's established how rigorous the standards are we will make sure they meet the standards but what we can't do is create standards of our own and create test requirements of our own.

Senator Debra Bowen: But what happens if your testing folks discover a bug, let's start with that, what happens if they discover a bug? First somebody has to make the determination whether it interferes with functionality under the standards.

Systest (Brian Phillips): Well if it's a bug that – any kind of bug that doesn't allow you to continue to use the system, whether it's a stated requirement or not, that has to be fixed. If we can do it, a poll worker or an election worker in the back office creating ballots can do it. So that has to be fixed. That's not a question.

Senator Debra Bowen: But I think the question arises, certainly there's been a huge amount of discussion about this, not in this kind of forum, but in the last two weeks this country has seen a huge number of problems, Sequoia in Chicago, Hart Intercivic and ES&S in Texas, where machines didn't work right, they didn't tally the votes right.

We had an instance in Tarrant County, Texas, where the vote totals from each precinct were added to the whole previous total and that total kept being added into the next precinct with the result that the election night results showed more than 100,000 more votes counted than were actually cast. Fortunately that's a recoverable error because you still have the original numbers, but I think this raises a real question for people, why do machines work correctly in the test labs, get certified by NASED and then have such high failure rates in the field? What's going on?
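
[Illustration: the Tarrant County error as plain arithmetic. If each precinct's report mistakenly folds in the running total so far, the election-night figure balloons even though the per-precinct numbers remain recoverable. Python with made-up counts:

precincts = [120, 95, 210]   # hypothetical per-precinct vote counts

correct = sum(precincts)     # 425

inflated, running = 0, 0
for votes in precincts:
    running += votes         # running total so far
    inflated += running      # bug: adds the whole running total each time

print(correct, inflated)     # 425 vs 760 -- the kind of overstatement described
]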

Systest (Brian Phillips): Well there's a difference between testing in a lab and working in real life. You've got far more components, let's say Travis County Texas may have several thousand polling place devices. We as a test lab may have two or three, that's all that's available to us. We're also looking at, we don't, what I should say is when we're testing it to the VSS we're not really looking at election day processes either, what are some of the controls in place and some of the security measures in place and things such as that. So that'll vary from county to county and precinct to precinct itself so we don’t really look at that. And that can have a play on how things will work out in the field. That is actually something that has been raised by a number of people that, you know, the ITAs ensure that the product functions to the standards, but there's more to it than the standards there's election day and pre-election day processes and post election day processes that we don't look at.

Software is not a simple thing to test, as we all know. We all understand that.

Senator Debra Bowen: If the largest software manufacturer in the world with billions of dollars at stake cannot protect itself from a day-long denial of service attack, that tells us something about the difficulty of doing it because if you were the largest software manufacturer in the world and you had the ability to protect yourself from that, you could anticipate it, you would. So I think that example alone tells us what we're up against.

Systest (Brian Phillips): Right.

Senator Debra Bowen: And the number of lines of code and the difficulty of trying to understand how something at the very front end of a long program might refer to a line that's buried deep somewhere else. Or even something in a comment that's executable that wasn't intended to be.

Systest (Brian Phillips): Right.

Senator Debra Bowen: So you have quite a challenge and I think the question for us is, given that the baseline for voting machinery in this country right now is the NASED number and the test results, is it realistic for people to feel uncomfortable with that process and what do we do either to improve the test process or if it's just an inherent limitation of trying to test two or three machines in a lab versus what happens in LA County where we have 5,000 polling places, each with multiple stations, do we have to have a different kind of system that would let us more accurately field test?

Systest (Brian Phillips): I mean I would, you kind of answered your question. You do want to have something that gives you an opportunity to more accurately field test systems that are going to be hooked together at some point in time. I mean you look at it, you may have two or three, I'm not sure how many polling place devices that L.A. County may have in any particular precinct, what's the maximum number they might have…

Senator Debra Bowen: In a primary?

Systest (Brian Phillips): Or in any election, if they would have two or three or four.

Senator Debra Bowen: Oh no, way more than that.

Systest (Brian Phillips): For a precinct?

Senator Debra Bowen: Yes.

Systest (Brian Phillips): I'm used to Colorado where I see two or three.

Senator Debra Bowen: No I think it's more, you can have easily 10 voting stations in a primary in a precinct where you've got the maximum number of voters. That's just my experience walking into my Los Angeles County polling place.

Systest (Brian Phillips): So that's part of what we'd have to look at, what are the conditions that we could simulate that would show the maximum kind of conditions you would have for a particular state. But then we're getting back, what we're looking at is we're looking at requirements for particular states and our charter is to ensure that the voting system meets a set of requirements that are intended to go across all 50 states without forcing any particular solution because each state has different requirements.

Senator Debra Bowen: Right, and as we've talked about, even within the states, in California we have 58 counties, each of which does its own procurement. It's my understanding that in Michigan, voting systems are procured by township. So Clinton Township, which has a population of 6,000, or did when I was at Michigan State, makes its own decisions about what kind of voting equipment to buy and what conditions to use. Some of what HAVA was designed to do was further the standardization.

Let's finish walking through this; with so many questions it's very easy not to have an orderly hearing process in which we walk through the questions in as logical and accurate a way as your presentation.

Systest (Brian Phillips): Where I was going next was that we, within our functional configuration audit we start to look at the vendors, their own system testing and verification specifications. There we're looking at them to see that they do a thorough enough job of testing within their own organization. Anything that we find within just their own testing process and procedures, we will write discrepancies against and that must be fixed. We look for the adequacy of their testing, we look to see if there are any gaps in their testing, we try to identify the overall scope of their functional testing and have they tested to that, we look at the test tasks they have and all their predecessor tasks and have they met those and again the discrepancies are documented to the vendor to be fixed. And the reason I underlined iterative at the top is because once they're fixed we come back and we run through this process again.

While the ITAs, at least on the software side, aren't required to do exhaustive testing, meaning testing absolutely everything we can, we believe the vendors' own QA organizations should be. When I've run QA divisions for companies I've worked for, that was our charter: do the best possible testing you can and get to the level that your industry requires. In banking and finance you can't have any problems with the code. In telecommunications, apparently you're allowed to have all the problems you want; try to get DSL service, and so forth. There's a level required for each industry. In the voting system industry, as the vendors should know, no bugs are allowed.

Senator Debra Bowen: But is that possible? Yesterday the Bay Area Rapid Transit system went down for 20 minutes because of a software problem. I guess my question to all three of you is: is it possible, as human beings, to achieve perfection in systems that are written and designed by human beings, assuming there's no malfeasance ever and all we are dealing with is innocent errors? Is it possible? Do you have such things in military circles? Do you have such a thing as zero-defect software?

Systest (Brian Phillips): We have requirements to get to zero-defect software, but just to give you an example: in the '80s I worked on a system with 64K of memory. It was really not that big, but it did everything; that was the beauty of it. We were required to have zero defects; that was our charter. And I think on the first mission there was one defect that got through. One defect out of this very complex set of software, and it only came about because there was a hardware failure. But nonetheless, even the best... I mean, this was a team of 120 aerospace engineers poring through the code. I myself was responsible for maybe 50 lines of code. That was a module in and of itself, so I knew exactly everything that it was supposed to do. So even then--

Senator Debra Bowen: And you spent how much time on those 50 lines of code?

Systest (Brian Phillips): Several years.

Senator Debra Bowen: And now you're looking at one and a half million lines of code. And I'm assuming you don't have several person-years for every 50 lines.

Systest (Brian Phillips): No.

Senator Debra Bowen: So the complexity is working against you.

Systest (Brian Phillips): True.
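(For scale, the arithmetic behind this exchange can be made explicit. Reading "several years" for 50 lines as two to three person-years, an assumption, not a figure from the testimony, comparable scrutiny of the one-and-a-half-million-line codebase mentioned above would take tens of thousands of person-years:)

# Back-of-the-envelope extrapolation of the effort discussed above.
lines_reviewed = 50
years_low, years_high = 2, 3          # assumed reading of "several years"
codebase = 1_500_000                  # lines of code cited in the exchange
chunks = codebase / lines_reviewed    # 30,000 fifty-line modules
print(f"{chunks * years_low:,.0f} to {chunks * years_high:,.0f} person-years")
# -> 60,000 to 90,000 person-years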



Wilms Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 11:04 AM
Response to Original message
1. Fascinating
Bill Bored Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Apr-03-06 04:38 AM
Response to Reply #1
14. Please elaborate. nt
Amaryllis Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 11:19 AM
Response to Original message
2. Finally it is coming out....
This highlights exactly why private corporations don't belong in the election business:


"The press has stated that Wyle works in secrecy, suggesting some sort of a clandestine or inappropriate operation. The fact is that Wyle is bound by policy and ethics to respect the privacy of its clients. We view the relationship between an independent testing laboratory and its clients as similar to that as between a lawyer and client, or between doctor and patient. As a matter of policy, Wyle does not discuss any client by name, nor will we release any test information or data without written consent from the client. I need to underscore that this is not a practice simply for voting machines, but for every product that we test across the entire the spectrum. Simply stated, the test data belong to the client, and they are not ours to share with third parties. This is standard practice for which all of the independent testing laboratories in the United States which provide a critical infrastructure for our healthcare system, manufacturing sector, and the economy in general."

This is fine for the private sector, but what the hell is this practice doing in the election business?
Land Shark Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 11:49 AM
Response to Reply #2
3. If Wyle's relationship is "similar to that as between lawyer and client"
then there's really NO BASIS for such a relationship to have an effect beyond the confines of the lawyer/client relationship.

On reconsideration: imagine that OJ could meet with Johnnie Cochran, his lawyer, and then come out and pronounce that Johnnie Cochran had certified that OJ was innocent!! We could save all the money and expense of an expensive procedural exercise called a trial! :sarcasm:

Wyle is FUBAR. It's undeniable.
Amaryllis Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 01:57 PM
Response to Reply #3
6. what's FUBAR?
Kurovski Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 05:34 PM
Response to Reply #6
11. FUBAR is an acronym that originated in the military
to stand for the words f***ed up beyond all repair
Amaryllis Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Apr-03-06 07:51 PM
Response to Reply #11
17. I see. A technical term. :)
Kurovski Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Apr-03-06 08:39 PM
Response to Reply #17
18. Yes. Very complicated. :)
Amaryllis Donating Member (1000+ posts) Send PM | Profile | Ignore Tue Apr-04-06 12:13 AM
Response to Reply #18
19. Thank you for enlightening me. Knowing this will enhance the quality of my
life. :)
Kurovski Donating Member (1000+ posts) Send PM | Profile | Ignore Tue Apr-04-06 12:27 AM
Response to Reply #19
21. Not to mention the quality of your communication.
It will now be on a "presidential" level. ;-)

Amaryllis Donating Member (1000+ posts) Send PM | Profile | Ignore Tue Apr-04-06 12:34 AM
Response to Reply #21
22. Yes, particularly the quality of my communication.
Kurovski Donating Member (1000+ posts) Send PM | Profile | Ignore Tue Apr-04-06 01:05 AM
Response to Reply #22
23. Yes.
Amaryllis Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 01:59 PM
Response to Reply #3
7. Ummm....unless
"If Wyle's relationship is "similar to that as between lawyer and client"
then there's really NO BASIS for such a relationship to have an effect beyond the confines of the lawyer/client relationship.

Unless you don't want the public or election officials to know what is REALLY going on.
glitch Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 08:51 PM
Response to Reply #3
12. Perfect analogy.
"On reconsideration: Maybe the idea that OJ could meet with Johnny Cochrane, his lawyer, and then come out and pronounce that Johnny Cochrane certified that OJ was innocent!! We could save all the money and expense of an expensive procedural exercise called a trial!"
WillYourVoteBCounted Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 11:52 AM
Response to Original message
4. looking forward to transcripts being posted
Looking forward to transcripts being posted somewhere that doesn't say
"PERMISSION TO REPRINT.." AND you must link to bbv.org


or else

Ve are Vatching You!
kster Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 01:17 PM
Response to Original message
5. K&R..nt
Kip Humphrey Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 02:53 PM
Response to Original message
8. Outrageous that this is how we protect our votes
It should also be noted that shortly after our accreditation as an Independent Testing Authority, by NASED, we limited our involvement to the precinct level voting system, that is, the machine that a human would interface with during an election, and the firmware that would be resident in that system.

Wyle does not perform software evaluations at the central count level and does not evaluate election management system software.


In effect, the e-voting SYSTEM remains untested: no reliability testing, no security testing, no integration testing. Therefore, there can be NO test results that speak to the reliability or accuracy or security of the e-voting system. Testing only one piece of the system is just short of a complete waste of time and must necessarily produce results that are deceptive, misleading, and dangerously close to being fraudulent when used to certify an e-voting system.

No wonder these systems can be designed and implemented with undetectable vote manipulation built-in with no one the wiser except those who designed the systems. This accreditation process stinks to high heaven and is nothing less than an open invitation to steal elections. Sorry for my babbling but as a systems integration IT guy, I'm OUTRAGED!!! (but not surprised)
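(The integration gap described above can be shown with a toy example: two components that each pass their own tests yet miscount once connected. This code is entirely hypothetical and drawn from no vendor's system:)

# Each piece passes in isolation; the combined system miscounts.
def precinct_machine_total(votes):
    """Tested at the precinct level: reports its tally as text."""
    return str(sum(votes))                       # unit tests pass

def central_tabulator(reports):
    """Never tested against the machine above; assumes 2-digit fields."""
    return sum(int(r[:2]) for r in reports)      # silently truncates

assert precinct_machine_total([1, 1, 1]) == "3"  # component test: passes
assert central_tabulator(["12", "34"]) == 46     # component test: passes

report = precinct_machine_total([1] * 457)       # integrated run: "457"
print(central_tabulator([report]))               # prints 45, not 457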

Wilms Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 03:19 PM
Response to Reply #8
9. That was one of the snips I found entertaining. n/t
Bill Bored Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Apr-03-06 04:23 AM
Response to Reply #8
13. Welllll, if they are testing to the standards,
it might behoove us to see if the standards cover election management systems. I doubt that they do, at least not very well, but GEMS, for example, is certified by someone, so there must be something somewhere in the standards that covers it.

It could just be that Wyle doesn't do EMS testing.

They probably outsource that to the Acme Independent Testing Lab.

Kurovski Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Apr-02-06 04:39 PM
Response to Original message
10. K&R. One vote more to move this info to greatest page.
Edited on Sun Apr-02-06 04:40 PM by Kurovski
Kelvin Mace Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Apr-03-06 10:14 AM
Response to Original message
15. Bev Harris was banned from here
for her egregious and vile conduct.

She has, by her own admission, burned her source and subjected him to three felony charges.

Why is this being posted here? Bev has her own web site and yet folks keep posting things here in hopes of rehabilitating her reputation or conning the unwary newbie.
Amaryllis Donating Member (1000+ posts) Send PM | Profile | Ignore Tue Apr-04-06 12:16 AM
Response to Reply #15
20. This is important info. That's why it was posted here. I do not believe
Edited on Tue Apr-04-06 12:17 AM by Amaryllis
the poster had any motivation other than making this important information available. The person who posted this has been following Senator Bowen's work very closely.
Kelvin Mace Donating Member (1000+ posts) Send PM | Profile | Ignore Tue Apr-04-06 09:38 AM
Response to Reply #20
24. Perhaps, but a short summary
with a link would have been sufficient.
nicknameless Donating Member (1000+ posts) Send PM | Profile | Ignore Tue Apr-04-06 07:09 PM
Response to Reply #24
25. I posted it this way deliberately so that people whose stomachs are turned
by B*v could avoid going to her website. If I had only posted a summary with a link, people would have had to go to BBV.org to read the transcript.

Did you read my intro?
I'm posting ONLY the transcript. And I'm posting it in its entirety, in HUGE 30-minute chunks, rather than just a few paragraphs with a link.

At this point, unfortunately, the transcript is only available at that site. I looked, but could find it nowhere else.

This was State Senator Bowen's hearing -- not B*v's.
And it's important that we know what was said by these ITAs.
Steve A Play Donating Member (638 posts) Send PM | Profile | Ignore Mon Apr-03-06 04:56 PM
Response to Original message
16. Thank you Senator Debra Bowen!
Edited on Mon Apr-03-06 04:56 PM by Steve A Play
:kick:
