Larger Expanded Scientific Study Confirms At Minimum Over 200,000+ Mail Ballots With Mismatched Signatures Counted Without Review (“Curing”) In Maricopa County, Arizona 2020 General Election

In this discussion, Dr. Shiva Ayyadurai provides findings from the Election Systems Integrity Institute’s recent audit of signatures on 1,911,918 early voting mail ballots in Maricopa County’s 2020 General Election.

ESII-Signature-Verification-Final-Report-Extended-Study-Updated-Redacted

Key Points

• At minimum, 215,856 early voting mail ballots (EVBs) should have been cured in Maricopa versus the ~25,000 cured by the County in the 2020 General Election.

• This updated Extended Study (“the Study”) along with the Pilot Study are the first to calculate signature mismatch rates of EVBs for Experts – Forensic Document Examiners (FDEs), Trained Novices (non-FDEs), and in a Two-Step Review process using non-FDEs and FDEs.

• One constraint of this Study is not having access to the signature files from the County.

• Given the nearly 10x difference in EVBs to be cured between this Study and the County’s actual number cured, if the County were to provide their signature files, an update to this Study could be performed.

• Maricopa County Election Dept. states it has a “rigorous signature verification process.”

• Of the 1,911,918 EVB signatures verified, the County reported only 25,000 were flagged as signature mismatches requiring review – “curing;” and after curing, the County concluded only 587 of the 25,000 (2.3%) to be “Bad Signatures.”

• This Extended Study confirms the findings of the earlier Pilot Study and concludes that the signature verification process used in Maricopa is flawed.

Rough Transcript (Auto-Generated)

SUMMARY KEYWORDS

signature, mismatch, ballots, cured, study, people, signature verification, maricopa, analysis, calculate, match, county, novices, systems, election, envelopes, maricopa county, called, pilot study, experts

SPEAKERS

Dr.SHIVA Ayyadurai

Dr.SHIVA Ayyadurai 

Hello, everyone, this is Dr. Shiva Ayyadurai. I hope everyone’s doing well. 

We’ll wait for people to come on in. Today we’re going to be doing an interesting presentation; it comes after a lot of hard work. And it’s a follow-up to a previous study that we presented, a pilot scientific study we did, I think, about two or three weeks ago.

And in that scientific study, we shared with you the initial results of our findings about the quality of the signature verification of ballots that took place in Maricopa County. I think that was on February 22, 2022. The Election Systems Integrity Institute, which we founded about a year ago, and we held a major conference a year ago, has now gone into full gear; we’re doing lots and lots of research projects.

And we publish work based on the work, in this case, of EchoMail. And in that pilot study that we did, we analyzed close to 2 million envelopes, which is where the ballots go in. And our initial study was done for the Arizona Senate, which commissioned us to do it.

And in that study, we were asked to look at whether a signature existed or not in the box on the front of the envelope. We weren’t allowed to do the signature verification.

Subsequent to that, and more recently, using those ballot envelope images and using signatures that we were able to acquire publicly from the Maricopa deeds repository, we did the first study of its kind in the world. It’s interesting: in the scientific literature, no one has done a study of literally measuring the mismatch rates of signatures, where you present to a trained staff member or an expert one signature of someone, which is on the envelope people sign when they put in a ballot, versus their signature on file.

And comparing and doing enough of those to get what’s called a signature mismatch rate. No one had ever done that; we were the first to do that. We did a pilot study, as I was sharing with everyone a few weeks ago.

And we came up with some extraordinary and compelling results, which said at minimum over 200,000 ballots in Maricopa were counted, but they didn’t go through a process called curing. So I’m going to cover that today. All right, so at that time, we did that study with a sample size of around 500, which represents, in statistical terms, 95% confidence.

Over the last several weeks, we’ve now done a study that goes over 2,500 samples, in fact over 2,700. We’ve done two analyses within what we call an extended study. And that study also confirms the pilot study.

And I want to share that with you. So I believe this is going to be educational for most of you; I hope you enjoy it. And let me first of all begin by letting everyone know that the Election Systems Integrity Institute is a part of another, you know, larger foundation I have, called the International Center for Integrative Systems.

You can find it at integrativesystems.org. And this center I started close to, I think, 14 years ago.

And our center really focused on educating people on the power of systems thinking. And we run many, many different programs and projects inside of it; anyone’s free to explore that. But if you go to the Center’s website, you’ll see that we’ve done a whole range of projects on systems.

So one of the projects we’ve done is, for example, systems biology: using the systems approach to look at the interconnection of food systems. So we did quite a bit of work on genetically engineered foods. And we discovered some significant problems when the food is engineered and what it does at the plant systems level.

We also have, out of the systems work on food, the international clean and raw food certification program; that’s also part of the International Center for Integrative Systems. For those of you who know young people or have young kids, let them know that we also have something called Innovation Corps, at innovationcorps.org.

It’s an initiative that I started to really recognize young innovators at the age of 14 to 18. And if you go to the website, we still have grants available, but it ends on March 31 of this year; young people 14 through 18 can apply. And our center funds those kids with a $1,000 grant, they get mentoring sessions, but we really want to recognize young people.

And another part of the International Center for Integrative Systems is the Election Systems Integrity Institute, which focuses on doing publications and research on election systems. And that’s what we’re going to talk about today. Okay.

And for those of you joining, we’re going to be discussing one of the research projects we just finished on understanding the signature mismatch rates. Let me just go right into that. This is an effort out of the Election Systems Integrity Institute. I’m going to go right into that so we don’t waste any more time. But here’s the cover of the, of our, sorry, the program here just started up.

This is a cover of the report that we submitted. We recently updated it, in fact, this morning, we submitted it. And I want to thank our team. 

for it. The version I’m showing you has certain information that was only viewable to the Attorney General of Arizona. But what this extended study confirms, at minimum, is that over 200,000 mail ballots with mismatched signatures were counted without review, which is called curing, and you’re going to learn about that, in Maricopa County, Arizona.

And I just want to let everyone know, this process of curing occurs in 22 states in the United States. So think about Maricopa as a case study of potentially what’s going on elsewhere. So let’s just jump right into it. First of all, this is the entire study; this will be published on our website.

But the key elements, and it’s a long study, over 100 pages, 114 pages, which we put together, but let me just go to the executive summary. What we discovered was at minimum 215,856 early voting mail ballots, which I’m going to call EVBs, should have been cured in Maricopa versus the 25,000 that were cured by the county in the 2020 General Election.

This updated extended study, along with the pilot study that we did several weeks ago, are the first, to the best of our knowledge, to calculate signature mismatch rates of EVBs for experts, whom we’re going to call forensic document examiners (FDEs), for trained novices, whom we’re going to call non-FDEs, and also in a two-step review process using both non-FDEs and FDEs.

Now, one constraint of the study I want to point out to everyone is not having access to the signature files from the county, so we had to go, you know, mine and get our own signatures that were publicly available. Now, given the nearly 10x difference in EVBs to be cured, we found 215,000, the county had 25,000, between this study and the county’s actual number cured, if the county were to provide their signature files, an update to the study could be performed.

So let me provide some key elements of the abstract here. I think one of the things I want to point out here, which I sort of went through quickly, sorry about that, is that you can see we calculated different mismatch rates based on different conditions, different experiments we did, ranging from as low as 11% all the way to the highest, 48.9%.

And you can see the number of ballots the study predicts should be cured. We’re in this case only taking the minimum amounts; I just want to let everyone know we’re being very, very conservative. We’re saying, at a minimum, over 200,000 ballots should have been cured, but there could be higher levels.

So I want to just let everyone know we’re being very conservative in this estimate. Alright. So just a review. 

For those of you, what occurred in the past is that we did an initial pilot study. And in that pilot study, we used 499 EVB signature images, and by the way, that’s a randomly selected sample from the 1.9 million that represents 95% confidence, such that the real value would be within a plus or minus 4% margin of error.

Now, in that pilot study, we had six reviewers: three experts, called forensic document examiners, and three trained novices, who were presented pairwise images of signatures from the envelopes and a genuine signature. And they all concurred on about 12% of the EVBs, this again is a minimum, as signature mismatches. The pilot study concluded that over 229,000 EVBs should have been cured, versus the upwards of 25,000 that the county cured.

Okay. And though the results from that pilot study were very, very compelling, you know, we decided that it would be important to do an extended study, right? So, you know, in science, sometimes they’ll do a little biopsy, or in engineering you’ll do a small test, just to get an idea: hey, is something going on there? If it is, then you go do a larger study. So we did the initial study, and we found what we thought were some compelling results.

And now we did this extended study. So that’s what you’re getting today. Okay.

So let me go back here. Now this study used a sample of 2,770, that’s five times larger than the pilot study, and that gives us a confidence level of around 99%. Again, 99% with a plus or minus 2.5% margin of error. So a much more highly resolved study, a more accurate study. Now this study used a revised sample size also, of 2,379 EVBs.
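As a rough illustration of the sampling figures being cited here, the following is a minimal sketch assuming the standard normal-approximation margin of error for a proportion at the worst case p = 0.5; it is illustrative only, not the Institute’s actual calculation.

```python
# Minimal sketch of the sampling math referenced above, assuming the standard
# normal-approximation margin of error for a proportion (worst case p = 0.5).
import math

def margin_of_error(n, z):
    """Worst-case margin of error for a sample of size n at z-score z."""
    return z * math.sqrt(0.5 * 0.5 / n)

print(f"Pilot,    n=499:  +/- {margin_of_error(499, 1.960):.1%}")   # ~4.4% (quoted as ~4%)
print(f"Extended, n=2770: +/- {margin_of_error(2770, 2.576):.1%}")  # ~2.4% (quoted as ~2.5%)
```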

First, we used all 2,770. And then we did a revised study to be even more conservative. Let me explain what I mean by this.

So we took 2,770 images out of the 1.9 million. So that was our sample, at a 99% confidence level.

And we put them here; then we had to go get the genuine signatures for those samples. Where did we find them? Well, the county did not give us the signature files. So we went to the recorder’s database in Maricopa, which is a publicly available database where you can find people’s deed signatures. And we mined those.

So we have two signatures side by side. Someone could say our study could be inaccurate, because the signature that we got, was it really accurate? So we did two sets of studies. And I’ll show you what we did here. We first did all 2,770.

And that’s called Analysis A. But we also did another analysis, because what we discovered when we did the first analysis was that there were about 290 signatures where everyone said, hey, these are definitively bad signatures; everyone agreed these are mismatches, to be specific.

Okay, so we said, okay, if these are mismatches, that’s pretty amazing. Or it could be that we got the wrong signature. So guess what we did: instead of keeping them, which would improve our accuracy and make people more confident, we said we’re going to eliminate them.

Okay, and still see how it comes out. And that was the second analysis we did. And we also took out some others, where we were being even more paranoid.

Okay. So out of 2,770, we removed around 391, I think.

And that left 2,379. And we did a second analysis, and I’ll share that with you. So we really did two analyses within this study. So let’s go right into that.

So what we found was, let me just go right into it, what did we find? The summary: we found that if experts, forensic document examiners, that’s what they’re called, FDEs, alone were used to review the EVBs, then at minimum 786,753 EVBs should have been cured, or at a maximum 936,457 EVBs. That’s if experts reviewed them. Remember, experts apply a lot more scrutiny. In these counties that do signature verification, they have trained staff, but the training they get is a few hours, okay; it’s not like they’re forensic document examiners.

So we also looked at how the trained novices would do, right, non-FDEs. And in their case, they would have found at minimum 344,528 EVBs should have been cured, or at a maximum 544,897. Now, the next thing we did was we emulated what occurs in Maricopa, where the trained staff volunteers, you know, people who are not FDEs, find a signature mismatch.

It then goes to what is called a manager, someone with more expertise. And if they also say it’s a mismatch, then it gets sent to curing. Curing is where they call people, and I’ll go through this more, to say, hey, maybe the person had Parkinson’s or they had a difficulty, and that’s why their signature is off. So it’s really a two-step initial review process.

So we simulated that by doing this two step process. So we had really three signature mismatch rates we calculated for both analyses, okay. And so let me go through that. 

So what the study revealed, at minimum, is, you know, over 200,000 early voting mail ballots should have been cured. So this confirms the pilot study. And what we want to really bring out here is, if the county were to provide us the signature files that they use, so we didn’t have to go do all this hard work of mining signatures from the public, then we could update this study, and also use machine algorithms to do a full analysis.

Okay. So let’s begin with the background. By the way, this is Dr. Shiva, for those of you who just joined us. This is a review where I’m sharing with you the results of our extended study, following the pilot study, of calculating the signature mismatch rates, essentially really performing the first study of its kind, in an extended way, on signature verification. That’s what you’re joining.

So welcome, everyone. So the background to this is, let me give you a little bit of education, because some of this stuff may be new. So let’s understand: what is signature verification? First of all, it’s a multi-step process.

It’s really a systems process. And it’s aimed at verifying a signature based on a review of two signatures side by side, one being the genuine signature, the other being a questioned signature. And in elections, what happens is, people get their envelopes, okay.

People put their ballot, I’m sorry, in an envelope, in early voting mail ballot envelopes. And it’s sent to a facility, typically in the election office, where it’s scanned, so the envelope is scanned to create an envelope image. Then what happens is an initial review is performed, before you even open that envelope, to determine if the person who signed is in fact the person who they say they are.

Okay, so how is this done? Well, side by side, human beings review the ballot envelope image, okay, the envelope image, which has a signature, and they also have a genuine signature on file. Now, in Maricopa County, they have signatures on file, perhaps a person’s voter registration signature, or their DMV, motor vehicles, signature.

And human beings reviewed all nearly 2 million or over 1.9 million envelopes. Okay, and they did this review. 

So first, the trained staff reviewed it, right. And if this was found to be a match, we’ll talk about what happens there. If it was found to be a non-match,

one of the things they did in this case is they sent it, as you can notice in the sub-bullet here, to a manager with more expertise to determine if it should be cured, okay. So if it’s a match, the envelope is opened and the ballot is processed; if it is not a match, then it goes through curing. And in curing, if it’s found to be a match, then the ballot is sent to tabulation, as it would be in the first step.

And if it’s confirmed not to be a match, then it’s denoted as a bad signature. Okay, so for all of you joining us, this is the signature verification process. Maybe there are variations, where you’d use bipartisan people, etc.

But this is really the process. Okay. It’s a multi step process. 
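To make the flow just described easier to follow, here is a minimal pseudocode-style sketch of that multi-step process as narrated above; the review functions and status strings are hypothetical placeholders, not the county’s actual system.

```python
# Hypothetical sketch of the multi-step signature verification flow described
# above. The review callables and status strings are placeholders, not
# Maricopa's actual system.

def verify_envelope(envelope_sig, genuine_sig_on_file,
                    staff_review, manager_review, cure_outreach):
    """Return the final disposition of one early voting ballot envelope."""
    # Step 1: trained staff compare the envelope signature to the one on file.
    if staff_review(envelope_sig, genuine_sig_on_file) == "match":
        return "open envelope and tabulate ballot"

    # Step 2: a manager with more expertise reviews the flagged pair.
    if manager_review(envelope_sig, genuine_sig_on_file) == "match":
        return "open envelope and tabulate ballot"

    # Step 3: curing, where the county contacts the voter to try to resolve it.
    if cure_outreach() == "resolved as match":
        return "open envelope and tabulate ballot"
    return "bad signature; ballot not counted"
```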

All right, in Maricopa, here were the results from the 2020 election. Again, this is the extended study we’re sharing with you.

This is not the pilot study; you saw a version of this before, but this is a much more intensive study that we did. Okay, so what we see here is that out of all the 1.9 million plus ballots, 25,000 were sent to be cured.

And that’s about 1.31%. Now, out of the ones that were sent to curing, 587 were finally decided to be bad signatures, which is about three hundredths of a percent of all the EVBs, all 1.9 million, and about 2.3% of the cured. Okay.

So what did we do? Okay, what we did, first of all, step one, was we selected a representative sample, and we wanted, in the extended study, to have a very high confidence level. Okay, so we selected a sample to have a confidence level of 99%, such that the margin of error would only be plus or minus two and a half percent, okay. And to achieve this, we needed 2,770, five times more than the pilot study.

That’s what we got. So first, we organized the data set of the envelope signatures with their images, of which we had 2,770. And then we had to create a dataset of the genuine signatures.

So the county in Maricopa did not give us the genuine signatures. So what we did was we went to the Maricopa recorder’s deeds repository. And so if your name was John Smith, we found John Smith’s deed.

But remember, there could be many John Smiths. How do you know John Smith is John Smith? Well, it turns out, if you use a middle initial, it really hones it in. If the middle initial, first name, and last name were there, then we accepted it. But if we couldn’t find that middle initial, then we also looked for the address.

And if we couldn’t find the address match, then it was thrown out. Okay. So it’s a multi-step process, using both humans and technology.
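Here is a minimal sketch of that matching rule as I read it from the description above: first and last name must match, then the middle initial, with the address as the fallback check. The field names and function are hypothetical illustrations, not the actual tooling used in the study.

```python
# Illustrative sketch of the deed-matching rule described above. Accept a mined
# deed signature only when first and last name match and either the middle
# initial matches or, failing that, the address matches; otherwise the pair is
# thrown out. Field names are hypothetical.

def deed_matches_voter(voter: dict, deed: dict) -> bool:
    if voter["first"] != deed["first"] or voter["last"] != deed["last"]:
        return False
    # A middle-initial match pins down which "John Smith" this is.
    if voter.get("middle_initial") and voter["middle_initial"] == deed.get("middle_initial"):
        return True
    # Otherwise fall back to the address.
    if voter.get("address") and voter["address"] == deed.get("address"):
        return True
    return False  # thrown out of the sample
```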

Now, if the county were to give us their files, we could update the study. So I just want to let everyone know, our constraint here was we had to go and mine genuine signatures. However, as the slide says, many forensic document examiners say that many times signatures on a deed are much more reliable, because you have to have a notary there.

Okay, versus signatures when you sign a voter registration file. All right. So that was that.

So the next thing we did was, we had two sets of people, as I mentioned, forensic document examiners and non-forensic people, as we talked about here, okay. And then they were presented with an image on the left, which was the signature on the envelope, and a genuine signature that we mined, and they had two choices: either select that it’s a match, or that it’s not a match. Match or no match. Okay.

All right. So then what we did, in step five, to put it simply, was we not only did that for the 2,770, and we got results, which is called Analysis A, but then we wanted to make sure, because some people may critique us; they may say your genuine signatures may not be really genuine. So we did an interesting thing.

We said, anytime all six people concurred it was a no-match, we could tell, wow, six people concur it’s a no-match. That was in the first analysis.

And the second is, we said, let’s throw them all away. Let’s assume that when everyone says it’s a no-match, it means the signature that we got is absolutely wrong. So we did that.

So we threw away 290, okay, close to 10%. And then we also were a little more, you know, deliberate in looking for a better match on the middle initial, and we got rid of another 101.

So in the second analysis, we took the 2770. And we threw away 391 to get a sample that had potentially less error of signatures that were wrong from the deeds. Okay, so we really, really wanted to be, you know, as conservative as we could. 

So that’s what I wanted to say: again, if the county gives us the signatures that they have, we wouldn’t have had to do a lot of this hard work. Okay. So let’s jump right into it.

Okay, so the first analysis we did was on the sample of 2,770. All right. And we did the first experiment within Analysis A, where the goal was to determine the signature mismatch rates using experts, forensic document examiners.

So we got three forensic document examiners, and they were presented the pairwise images, right, of the 2,770. And then we calculated what’s known as a pooled consensus mismatch rate. Okay.

We’ll talk about what that is. It means, out of how many times among all three FDEs, when they saw the same pair of signatures associated with an EVB, did they conclude it was a match or no-match. Okay. And we did that at each level.

It’s almost like three people voting, okay. And we aggregated all that, for the forensic document examiners, over here.

For each pairwise signature they were presented, we determined the distribution of probabilities. And then we also determined the mean of those probabilities across the 2,770 to determine the FDE pooled consensus rate. So this was the data for each expert; you can just look at that for a little while.

And you’ll see each expert had a range from 23% of the EVBs being mismatches all the way up to 71%. For each FDE, the blue is their match rate, and the red is their mismatch rate. Okay? All right, so they have varying mismatch rates.

And then this is, if you want to see, how their mismatch rates varied over time as they were processing those 2,770; you can also see that. All right, then what we did was, we wanted to now figure out the pooled consensus. So for every ballot, okay, from one to 2,770, we calculate, that’s what each one of these lines is.

The votes essentially, if one person of the three said it was a mismatch, that’s one out of three, right? 33%. If all three said it, it’s 100%. That’s why the scale here goes from zero to 100%. 

All right, so we have literally this histogram. All right, so these are literally the probabilities of that. And then we calculated what’s called the pooled consensus signature mismatch rate, which I’m going to call beta.

And we found that to be 48.98%. That means, among all the FDEs, the experts, 48.98% of the time on average they would say, hey, this early voting ballot has a mismatch. But remember, experts are seeing things that non-experts do not see.

All right, so that’s the first thing we did. And we call that variable beta. Next thing we did was we said, okay, let’s just group how they, you know, voted. 

So these are their votes grouped, from 100% of the time saying it’s a mismatch down to 0%. And you can see the distribution of ballots, okay. Just a nice graph here.

So at the end of the day, if you used the FDEs to do the curing review, they would have cured 48.98%, or 936,457 ballots, okay. Experts.
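As a small illustration of how a pooled consensus mismatch rate like this can be computed, here is a sketch: for each ballot, take the fraction of reviewers calling the pair a no-match, then average that fraction over all ballots. The data and function below are made up for illustration, not the study’s actual code.

```python
# Illustrative pooled consensus mismatch rate: per ballot, the fraction of
# reviewers who judged the pair a mismatch, averaged over all ballots.
# The votes below are toy data, not the study's.

def pooled_consensus_mismatch_rate(votes_per_ballot):
    """votes_per_ballot: one list per ballot; each entry is True if that
    reviewer judged the signature pair a mismatch."""
    per_ballot = [sum(votes) / len(votes) for votes in votes_per_ballot]
    return sum(per_ballot) / len(per_ballot)

votes = [
    [True,  True,  True ],   # all three reviewers say mismatch -> 3/3
    [False, True,  False],   # one of three                     -> 1/3
    [False, False, False],   # none                             -> 0/3
    [True,  False, True ],   # two of three                     -> 2/3
]
print(f"{pooled_consensus_mismatch_rate(votes):.2%}")  # 50.00% for this toy data

# Scaling the quoted FDE rate of 48.98% to all 1,911,918 EVBs gives roughly
# 0.4898 * 1_911_918 ~= 936,457 ballots, the figure cited above.
```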

Then we said, let’s do it with trained novices, non-experts, non-FDEs. And in this case, the county has a guide, and that guide was followed. Okay.

And again, we presented them with the same set; they also did the pooled consensus, same process. And here are those results. So you see ranges from 23.1% to 31.2%. These trained novices were very, very close to each other.

And you can see their results here. Okay, but bottom line, it’s in this range. And then we did the same thing for calculating the pooled consensus.

Again, every line here is literally a vote for what they did. It’s not a probability; it’s the actual results of each non-FDE, or trained novice, how they voted on each ballot. And that helps us calculate a different signature mismatch rate for the novices, which we call alpha.

And that turned out to be 28.5%. Okay, and again, you can see the distribution of their votes across the various ballots.

So this means for 1,481 ballots, none of them said it was a mismatch, which means they’re all matches, all the way down to 363 ballots, where all three said that they were mismatches. Okay. And then in between. All right, then what we did was we calculated the ballots.

So non-FDEs, the trained novices, if they alone were allowed to do it, they would have said close to half a million, or 544,897, should have been sent to curing. Right. Then we did something more interesting.

Remember what really happened? So the cases we’re looking at are: if just the FDEs did it, over 900,000 would have been cured; if just the novices did it, over 500,000. But in the election process, signature verification with curing in Maricopa is apparently a two-step initial review process. First the trained staff, in this case the trained novices, review it, and if they say, hey, this is a mismatch, it goes to an expert, or the manager, and then the expert reviews those again. So it’s really a joint function, a joint probability.

Okay, so we did that in the extended study a little more sophisticated than we did in the pilot. Okay. So here we go. 

So in Maricopa, again, the initial review involved trained staff. And that’s what I’m just describing here that we did. And this is another way to view it.

Just so people don’t get scared with the math here, it’s actually pretty simple, but just to keep it straightforward: E represents all the ballots.

It’s really a unit vector of all the 2,770 ballots that came in, and then these are reviewed by the trained staff, and there’s a signature mismatch rate of alpha. So what comes out, that would be sent to the FDEs, would be E multiplied by alpha.

Sometimes we will call it a dot product, because these are actually two vectors, okay? But this number of ballots now are no-matches and need to be reviewed by a manager. The manager has their own signature mismatch rate, which we’re calling beta, which we calculated. And at the end, the number of ballots you’d get would be E times gamma, and gamma is the combined mismatch rate, the joint probability, to be specific, among both the trained staff and the FDEs. Okay, so we calculated that. It’s quite a bit of work, but I’ll walk you through it.

So what we found was, if you calculate the ballots in this first case, that’s coming out right here, E times alpha, it turns out 790 early voting ballots would go to the managers for review out of the 2,770. Okay, that’s right about there. Then, in order to calculate gamma, we need the joint signature mismatch rate, right? So we plotted that, and then we calculated gamma, which turns out to be 22.27%.

And if you go to that 22.27%, if you notice here, we get various possibilities, because now two groups are voting on this.

Alright. And you see here that 617 ballots would have gone to curing because this is how many came through both people reviewing it. And if you work that out, that’s 22.27, which means 425,784. So what this means is that if we followed the two step process in America by using both people, this would mean that over 400,000, early voting ballots should have been cured. Okay. 

Now, obviously, people would say, hey, well, this seems high. And the reason your study is flawed is because you didn’t use the same signatures that the county did. Remember, we used the signatures that we got from the deeds repository.

So again, if the county gave us their signatures, this would not be an issue; we would just use those. Okay.

So we sort of stepped back and we said, okay, let’s be even more strict; maybe some of those 2,770 pairwise signatures we should throw away, whichever ones where we suspect the genuine signatures could be bad, let’s throw them away. And then, you know, redo the analysis. So that was really Analysis B.

Again, we were being conservative here. So, by the way, just to summarize Analysis A: these are the different signature mismatch rates, and these are the different numbers of ballots that would have been cured. Okay.

So that’s what the summary showed that I just walked you through. What we did next, pursuant to what I just said, was apply these additional constraints; we were really being sort of paranoid: hey, let’s remove any of those potential genuine signatures we got that could potentially be bad.

So how did we do that? We did two very interesting things here. We removed another 101, where we did further scrutiny on the middle initials matching. And if the middle initials didn’t match, we again checked the addresses.

So we found about 101 there, again being conservative. But then we said, let’s look at all the ones from the previous analysis that the FDEs, all three of them, said were not matching. That’s 582. And let’s look at all the ones that the trained novices said were not matching.

That’s 363. And then we said, how many of them do both agree on? All six agree, how many did they agree were not matching? Okay, now, what could that mean? So experts and novices are saying that 290 pairs definitively do not match. So we said, okay, what about, it’s called the wisdom of the crowds.

In statistics, we said, why don’t we say all of those are bad, right? There could be two possibilities: the genuine signatures we got are bad, or they’re indeed mismatches. But we took the conservative case, and we said, let’s just toss them all out.

Okay. So again, giving the benefit of the doubt to our critics, who may say, hey, you got the wrong genuine signatures. So we did that.

So we took out all those 290, okay, from here. So, at the end of the day, we removed 391 pairwise signatures, and we ended up with a new sample of 2,379. Okay. Now, what’s interesting is, among those 290 that we removed, just to let everyone know, there were some that were so clearly genuine signatures, because we had the address match and the middle initial, out of that 290 pool, but we threw them all away. Just to let you know how conservative we’re being, we even threw away definitively genuine signatures, okay? But that’s fine.

Because we want to, you know, lower our probability of error. So now, I can’t show you these, because these are constrained, because there are signatures here. So let me show you the summary of the updated analysis.

Okay. This is Analysis B. So now, in Analysis B, what I’m sharing with you is: we did Analysis A, we came up with the range of potential ballots; now we’re doing a more conservative estimate, where we’re throwing away anything that could potentially not be a genuine signature from the deeds.

Okay. All right. So what do we have here? We’re looking at 2,379 samples.

And again, we ran experiment one, like I’ve talked about; we did the pooled consensus analysis. And this is what we find in the new Analysis B: we find the ranges go from 12.4%, so they lowered, up to 66%. Okay. And those are the mismatch rates by FDEs, forensic document examiners, still high, but lower than Analysis A. And this is their distribution for each individual EVB, how they voted on them. Okay. And you can see, in this case, the beta is 41%, where in the earlier case it was close to 49%.

So it’s dropped by about 8%. Okay. So 41.15%. And this is, again, the group consensus probabilities of those. And what we find here is, in the second analysis, Analysis B, if the experts reviewed these EVBs, 786,753 would have been sent for curing, okay. Then we did experiment two with the novices.

All right, just like before in Analysis A, we now did this in Analysis B. And here, as everyone knows, we did the same process, the same pooled consensus, as I’ve talked about. And this is what you find: we find that among the non-forensic document examiners, the rate goes from 12.7% to 21.4%, again, lower than the original Analysis A. And again, we did the distribution to calculate the pooled consensus, right, and we find that to be 18.02%. What does that mean? Just like before, that means if you gave a signature to a trained novice, 18% of the time they would say, hey, that is a no-match, and they would send it to their manager; just like, if you gave something to an expert, 41% of the time they would say that ballot is a no-match.

Okay, so experts: 41.15%. And trained novices: 18.02%.

All right, so now we want to put it all together. So we did the same thing, with that 18.02%.

And by the way, that would mean 344,528 ballots would have gone to curing if you use the novices’ rate. But now we said, let’s apply the two-step process. And in the two-step process, we know, as we talked about, Maricopa has trained staff do it, then they send it to the manager.

So we did the same thing. But notice this time it’s 2,379 pairwise signatures. And so first, we calculate how many would have gone to the manager, which is E times alpha, and that is found to be 429 EVBs, less than before, okay, which is what we would expect.

And then we want to calculate the two-step mismatch rate, including both the novices and the FDEs, and we find that to be 11.29%.

So that means 269 ballots at the end of the day would have been sent to curing. Okay, so we put it all together: 11.29%. Again, this is a very, very conservative number, because we’ve now eliminated any ones we were potentially thinking were not genuine signatures, and we’re redoing the analysis. Okay.

So what do we find? We find, in the most conservative case, 11.29%, or 215,856 ballots, should have been cured. Okay.

All right. So let’s compare that now. So what that means is that when you line it up, in this second, more conservative analysis, it goes from 11% to 41%.

But we only use the 11.29% to put forward that at minimum 215,856 ballots should have been cured; that is far more than what the county cured, which is 25,000.

Okay. So, discussion-wise, you know, the county cured 25,000, which is 1.31%.

And what we’re saying here is that 11.29% should have been cured, which is about 10 times more, okay. Let’s see this.

One second here. Okay. Someone says, uh, all right, so that’s what we did here.

Sorry about that. And so this is what we find. So the discussion is the following. 

Based on the extended study you’re seeing here, we have a minimum signature mismatch rate of 11.29%, and the county’s post-curing mismatch rate is 2.3%.

That means 4,965 EVBs at minimum, that’s what you see right here, should have been thrown out, versus 587. So that’s important in such a close race.
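As a reading of how these closing figures relate to each other, here is a short illustrative calculation: the 215,856 comes from applying the 11.29% conservative rate to all 1,911,918 EVBs, and the 4,965 comes from applying the county’s own 2.3% post-curing bad-signature rate to that larger pool. This is a sketch of the stated relationships, not the report’s own code.

```python
# Illustrative arithmetic tying together the closing figures quoted above.
total_evbs          = 1_911_918   # early voting mail ballots, Maricopa 2020
conservative_rate   = 0.1129      # most conservative two-step rate (Analysis B)
county_cured        = 25_000      # ballots the county actually sent to curing
county_bad_sig_rate = 0.023       # county's post-curing rate (587 / 25,000 ~= 2.3%)

should_have_been_cured = conservative_rate * total_evbs          # ~215,856
ratio_vs_county        = should_have_been_cured / county_cured   # ~8.6x ("nearly 10x")

# Applying the county's own post-curing rate to that larger pool suggests how
# many EVBs would have ended up as bad signatures:
implied_bad_signatures = county_bad_sig_rate * should_have_been_cured  # ~4,965

print(round(should_have_been_cured), f"{ratio_vs_county:.1f}x", round(implied_bad_signatures))
```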

All right. And the conclusion that we have, that we want to put forward is the Malika Maricopa County Election department states that it has a rigorous signature verification process, if you read their stuff, they say they have a rigorous signature verification process, but they only care 25,000. And our extended study confirms a pilot study and puts forward that they actually have a flawed signature verification process. 

That’s our conclusion here. And we want to again make it clear: the extended study found that if FDEs alone were used to review, it would be at minimum 786,000. If the non-FDEs did it, it would be at minimum 344,000.

And if both did it, through the two-step process, it would be 215,000. Now again, I want to state this: one constraint of the study is not having the signature files from the county. Now, given the nearly 10x difference in EVBs to be cured between this study and the county’s actual number cured,

if the county were to provide their signature files, an update to the study could be performed. So we’re more than ready to do an update. And there are a number of things that we propose for future research, which I’m not going to get into, but essentially we’re saying that we could continue this study in a number of ways.

So anyway, I hope this was helpful. I know it’s a lot of detail. But there’s no other choice sometimes, you know, than to bear with it and dig deep.

And that’s what we’ve done. You know, the election integrity movement right now, in my view, has three different things going on.

One set of people deny there are any problems. And there are major institutes at Harvard, at MIT, and Stanford, who are all about denying there are any issues, with academics writing all these papers. That’s on one hand.

On the other hand, you have what I call grifters. The grifters spend all of their time, frankly, talking about nonsense, okay? Nothing burgers. And it’s unfortunate, because there are real systems issues.

There are real systems issues, which is what our institute was created for: to really go after, from a scientific perspective, those systems issues. So electionsystemsintegrity.org was founded on that.

And we have some incredible people working with us. And our goal is really to challenge the existing academic establishment, which believes there’s nothing to see here, move along, right. That’s what’s happening.

On one hand, one set of the academic institutions wants to hold on. I’m trying to capture this here. Okay, here we go. Let me just, I want to bring back the Election Systems Integrity site.

Okay, there you go. Ah, okay. Well, I guess I can’t bring that back.

Anyway. The bottom line is that you have these two groups: the grifters, who are essentially putting out garbage, and they’re all helping the deniers. But when you take a systems approach, a truly systems approach as we’ve done here, you look at the entire system flow, and you go down step by step.

Now, the problem is, a systems approach, when you do real stuff, takes a lot more effort. I think over the last four weeks, I myself and my colleagues have gotten about three hours of sleep a day doing the analysis. When you do the real work, you don’t have time to do the grifting, unfortunately.

Right. So this is why a scientific study like this is really important: because we’ve gone down, and we’ve looked at things various ways; we question ourselves, we critique ourselves, we want criticism. And what we’ve concluded here is, this study has confirmed the pilot study, where we dipped our toe in the water.

And now we did five times more samples, 99% confidence, plus or minus two and a half percent margin of error. And what we’ve concluded is, at a minimum, over 200,000 early voting ballots should have been sent to curing.

And the other point is, we are extremely open to working with Maricopa County. If they give us their signature files, maybe our genuine signatures are all crap, fine; we’ll run it through, and we’ll see what results we get.

That’s how science works. Anyway, I hope this was valuable for people. Let me see if there’s any questions people want me to answer. 

Let’s see here. Someone, thank you, Jason Gan, for your kind comments.

Someone is saying thank you for your work, encouraging people to look at the systems integrity. Okay.

You’re welcome. Appreciate your comments. And let’s see here. 

A lot of comments here to go through, okay. All right. So I think, I hope this, someone here says: many election laws were broken, it is a known fact, the scientific aspect is just icing on the election fraud cake.

Okay. So anyway, I hope this is valuable. This is Dr. Shiva; please go to vashiva.com if you want to find out more about what I do. And by the way, I have a little scrolling thing I want to mention: if there are young people that you know between the ages of 14 through 18, and they’re interested in applying for the Innovation Corps award, again, we want to recognize young innovators. We award them a $1,000 check.

We give them mentoring. We recognize them, but it’s really to recognize that great innovations can occur anytime, anyplace, by anybody. All right, everyone.

Thank you. Be well, have a good night. And if you’re having dinner, I hope you have a good dinner, because that’s where I’m going right now.

Thank you.
