Creating a Measure for Primality or Compositeness

The search for prime numbers has fascinated mathematicians for centuries. Various approaches have been taken, yet no perfect method has been discovered. Perhaps part of the problem is the definition of prime itself. Prime suffers from having a negative definition: a prime number is a number having NO factors other than one and itself. Prime numbers are all the whole numbers which are NOT composite. Composite numbers are whole numbers which have factors other than one and themselves. This, too, is a weak definition. The problem might be addressed by one of two approaches.
Written 2001 Formatted 2010 

[1] Create a new definition for Prime: A prime number could be defined as any odd number that can be written as the difference of squares in one and only one way. [e.g.: Prime: 3 = 2^2 - 1^2 only, 11 = 6^2 - 5^2 only, vs. Not Prime: 9 = 5^2 - 4^2 = 3^2 - 0^2] Another way to state this is any number that appears in the difference of only one diagonal. This definition has two problems: it misses 2, and it's still a weak definition. But it might offer new insights. A generated prime number might be definable as any number that may be found using the generalization of Euclid's proof. This has the classic problem of missing some primes while finding some composites, but it gives a new approach. Try generating your own definitions for primes, or for numbers with qualities similar to primes.
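The difference-of-squares definition can be checked mechanically. For odd n, n = a^2 - b^2 = (a - b)(a + b), so each divisor pair d * e = n (with d <= e) gives one representation a = (d + e)/2, b = (e - d)/2. A minimal sketch in Python (the document gives no code; the function names are mine):

```python
def diff_of_squares_reps(n):
    """All (a, b) with a**2 - b**2 == n, for odd n > 1.

    Uses n = (a - b)(a + b): each divisor pair (d, n // d) with
    d <= n // d yields a = (d + e) // 2, b = (e - d) // 2.
    """
    reps = []
    d = 1
    while d * d <= n:
        if n % d == 0:
            e = n // d
            reps.append(((d + e) // 2, (e - d) // 2))
        d += 1
    return reps

def is_prime_by_squares(n):
    """Odd n > 1 is prime iff it has exactly one such representation."""
    return len(diff_of_squares_reps(n)) == 1
```

For example, diff_of_squares_reps(11) finds only (6, 5), while 9 yields both (5, 4) and (3, 0), matching the examples above.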


[2] Create a new measure for primality or compositeness: Rather than defining numbers to be either prime or composite, we could create a measure to determine how composite any number actually is. This might create new patterns to study or new ways to view compositeness. Here are a few simple attempts to measure compositeness:

[A] Factor Count Approaches. Counting the factors obviously has merit, but it creates a bias towards large numbers: e.g., 1440 can have many more factors than 12 can. One solution would be to normalize each count by dividing by the number itself.
</gr-replace>
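The factor-count measure can be sketched as follows (a Python illustration of the idea, not code from the original; "proper factors" here means factors other than 1 and n, as the text uses them):

```python
def proper_factors(n):
    """Factors of n other than 1 and n itself."""
    return [d for d in range(2, n) if n % d == 0]

def compositeness_count(n):
    """Factor-count measure: number of proper factors, divided by n.

    Primes score 0; the more composite a number, the higher its score.
    """
    return len(proper_factors(n)) / n
```

For example, 12 has proper factors 2, 3, 4, 6, so compositeness_count(12) = 4/12, while any prime scores exactly 0.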
By this method a prime will have a value of zero, and the more composite a number is, the larger this value will be. In the graph, we can see that out of the first 36 counting numbers, 24, 36, and 12 are the most composite.

[B] Factors vs. Nonfactors. Instead of normalizing to the number or its square root, we might normalize the factor count to the count of numbers which are not factors, or similarly to the square root of the count of numbers which are not factors.
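One reading of variant [B] is sketched below, assuming "numbers which are not factors" means the integers from 2 to n - 1 that do not divide n (the original leaves this open, and the function name is mine):

```python
def compositeness_vs_nonfactors(n):
    """Variant [B]: proper-factor count over non-factor count.

    Counts factors of n in 2..n-1, then divides by how many numbers
    in that same range are NOT factors. The square-root variant in
    the text would divide by the square root of that count instead.
    """
    k = len([d for d in range(2, n) if n % d == 0])
    nonfactors = (n - 2) - k  # numbers in 2..n-1 that are not factors
    return k / nonfactors if nonfactors else float("inf")
```

Under this reading, a prime still scores 0 (no proper factors), and 12 scores 4/6, since four of the ten numbers in 2..11 divide it.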
[C] Factor Sum Approaches. Instead, we might look at the sum of the factors. A prime has no factors (other than 1 and itself, which to avoid redundancy we don't add). 12 has factors 2 + 3 + 4 + 6 = 15. Again, to avoid a bias towards high numbers, we normalize by dividing by the number itself.
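The factor-sum measure, as a minimal Python sketch in the same style (the function name is mine):

```python
def compositeness_sum(n):
    """Factor-sum measure: sum of factors other than 1 and n, over n.

    e.g. 12 -> (2 + 3 + 4 + 6) / 12 = 15 / 12 = 1.25; any prime -> 0.
    """
    return sum(d for d in range(2, n) if n % d == 0) / n
```

This is closely related to the classical abundancy of a number (sigma(n)/n), shifted so that primes land at zero.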
[D] Factors within a given region.


