Prime numbers have challenged mathematicians for a very long time. If we still can't write an equation for them, perhaps it is time to challenge our assumptions about these fascinating and extremely important numbers.
First, it is commonly believed that there is no pattern to prime numbers. Research conducted under this belief might very well be falling prey to a self-fulfilling prophecy. We know that prime numbers do indeed have a pattern. How? Because color-coding Pascal's Triangle to look like a Sierpinski Gasket shows that prime moduli form clean fractals, while composite moduli show overlapping fractals. I discussed this a little while ago. Using this method, with just a couple sentences of explanation, ANYONE can look at a picture of the triangle (mod some number) and tell you whether that number is prime (primes look clean, composites look cluttered). The pattern is visually obvious.
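Here is a minimal sketch of that visual test. The moduli (7 and 6) and the ASCII rendering are my own illustrative choices; any prime vs. composite pair shows the same contrast:

```python
# Color-code Pascal's Triangle mod n: mark entries NOT divisible by n.
# A prime modulus yields a clean Sierpinski-gasket fractal; a composite
# modulus overlays the gaskets of its prime factors and looks cluttered.

def pascal_mod(rows, n):
    """Yield each row of Pascal's Triangle with entries reduced mod n."""
    row = [1]
    for _ in range(rows):
        yield row
        row = [(a + b) % n for a, b in zip([0] + row, row + [0])]

def render(rows, n, mark="#", blank="."):
    """Draw nonzero residues as `mark`, zeros as `blank`."""
    lines = []
    for row in pascal_mod(rows, n):
        cells = "".join(mark if v else blank for v in row)
        lines.append(cells.center(rows))
    return "\n".join(lines)

print("mod 7 (prime):")
print(render(32, 7))
print()
print("mod 6 (composite):")
print(render(32, 6))
```

Printing both side by side makes the "clean vs. cluttered" distinction easy to see even in plain text.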
Second, let's tackle the concept that '1' is not a prime number... Some people will tell you that '1' isn't prime because it doesn't fit the rules. Others because it isn't convenient. Still others say it was due to the Greeks thinking number games weren't as challenging with it. Hell, you are probably wondering why we even care... Simple: If '1' IS a prime number, then all the equations that don't quite work change... the math changes... We get rid of "except 1" or ">=2" type of exceptions... but most importantly, because it would force us to look at things differently -- which is what we desperately need if we are going to 'solve' primes.
So, what about the argument that it doesn't fit the rules? What is the definition of a prime number? Generally, people assume it is "any number greater than 1 that has only 1 and itself as factors". So 1*3 is prime because no other number is a factor. 1*4 is not prime because 2 is a factor. 1*1 has no other factors, so it should be considered prime. But you can also see why people have a problem with it: it is also a square ;) They always say "1 and itself"... Well, under that wording '1' should be prime. The only reason they say it isn't is that they add the clause "greater than 1" or "except 1" to the rule. Occam's razor fans should be well aware of the problem here. You don't add a single exception (for a number that would otherwise work) just because you don't like or understand it.
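The "except 1" clause becomes very concrete when you write the definition as code. In this sketch (function names are my own), the textbook version needs an explicit special case, while the "only 1 and itself divide it" wording alone lets 1 through:

```python
def is_prime_textbook(n):
    """Trial division with the textbook definition."""
    if n <= 1:  # the explicit "greater than 1" / "except 1" clause
        return False
    return all(n % d for d in range(2, int(n**0.5) + 1))

def is_prime_no_exception(n):
    """Same test, clause removed: no divisor strictly between 1 and n."""
    if n < 1:
        return False
    return all(n % d for d in range(2, n))

print(is_prime_textbook(1))      # the clause rejects 1
print(is_prime_no_exception(1))  # without the clause, 1 qualifies
```

Both functions agree on every n >= 2; they differ only at n = 1, which is exactly the point being argued.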
You are probably still wondering why we care... Let's say you are writing a function f(n) = the nth prime. Well, if '1' is prime, then all the current attempts are actually computing f(n) = the (n+1)st prime... thus patterns may be easier to see if we quit assuming 1 can't be prime.
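That off-by-one shift can be checked directly. A quick sketch with a simple sieve (helper names are mine):

```python
def primes_up_to(limit):
    """Sieve of Eratosthenes: all conventional primes <= limit."""
    sieve = [True] * (limit + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(limit**0.5) + 1):
        if sieve[p]:
            sieve[p*p::p] = [False] * len(sieve[p*p::p])
    return [i for i, ok in enumerate(sieve) if ok]

conventional = primes_up_to(30)   # [2, 3, 5, 7, ...]
with_one = [1] + conventional     # [1, 2, 3, 5, ...] if 1 counts as prime

# The conventional nth prime is the (n+1)st entry of the 1-inclusive list:
for n in range(len(conventional)):
    assert conventional[n] == with_one[n + 1]
print(conventional[:4], with_one[:5])
```

Every formula fitted against the conventional list is implicitly fitted against an index shifted by one relative to the 1-inclusive list.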
What other assumptions do we have? That primes simply get further and further apart... Not quite: the average gap does grow, but small gaps keep recurring, and the twin prime conjecture holds that no matter how high you go, there will be twin primes. If so, twin primes are not a special case or exception -- they are part of the pattern. And don't forget what we said earlier -- it is a fractal -- so it repeats forever.
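This is easy to probe empirically (an illustration, not a proof -- the conjecture remains open). Block size and range here are arbitrary choices of mine:

```python
def is_prime(n):
    """Simple trial-division primality test."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n**0.5) + 1))

def twin_pairs(lo, hi):
    """Twin-prime pairs (p, p + 2) with p in [lo, hi)."""
    return [(p, p + 2) for p in range(lo, hi) if is_prime(p) and is_prime(p + 2)]

# Count twin pairs in successive blocks of 10,000: the counts thin out
# but never hit zero in any range we can comfortably check.
for start in range(0, 50_000, 10_000):
    pairs = twin_pairs(start, start + 10_000)
    print(f"{start:>6}-{start + 10_000:>6}: {len(pairs)} twin pairs")
```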
In fact, the whole concept that there are 'types of primes' is simply a misconception. Each type of prime is just a subset of the fractal.