Tuning-Math Digests messages 5975 - 5999

This is an Opt In Archive. We would like to hear from you if you want your posts included. For the contact address see About this archive. All posts are copyright (c).


Message: 5975

Date: Tue, 14 Jan 2003 11:13:07

Subject: Re: Notating Pajara

From: Graham Breed

Gene:
>>minimax 706.8431431

Manuel:
> Could this be wrong? I have 709.363 in the scale archive.

I agree with Manuel.


                         Graham




Message: 5976

Date: Tue, 14 Jan 2003 22:34:36

Subject: Re: Nonoctave scales and linear temperaments

From: Carl Lumma

>>Because complexity is comparative.  The idea of a complete
>>otonal chord is not less arbitrary.
>
>right, but whether a particular mapping is more complex
>than another shouldn't be this arbitrary!

I'm lost.  If you agree with that, then what's arbitrary?

>>Oh, crap.  What is it you've been pushing, then?
>>Weighted complexity?
> 
>that's been much more common, yes.

Ok.  Here's my latest thinking, as promised.

Ideally we'd base everything on complete n-ads, with
harmonic entropy.  Since that's not available, we'll look
at dyadic breakdowns.

If you use the concept of odd limits, and your best way
of measuring the error of an n-ad is to break it down
into dyads, you're basically saying that a ratio containing
n is much different from any ratio containing at most n-2.
Thus, I suspect that my sum of abs-errors for each odd
identity up to the limit would make sense, despite the fact
that for dyads like 5:3 the errors may cancel.
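
In code, that measure might look something like the sketch below;
the 12-tET figures at the end are only a sanity check, not anything
from this thread.

    import math

    def identity_errors(tempered_cents, limit):
        # Absolute error, in cents, of each odd identity 3, 5, ..., limit.
        # tempered_cents maps each identity to its octave-reduced tempered size.
        errs = {}
        for k in range(3, limit + 1, 2):
            just = (1200 * math.log2(k)) % 1200
            errs[k] = abs(tempered_cents[k] - just)
        return errs

    # 12-tET approximations of the 7-limit identities
    twelve = {3: 700.0, 5: 400.0, 7: 1000.0}
    print(sum(identity_errors(twelve, 7).values()))   # about 46.8 cents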

If we throw out odd-limit, however, we might be better off.
If there were a weighting that followed Tenney limit but
was steep enough to make near-perfect 2:1s a fact of life
and anything much beyond the 17-limit go away, we could
have individually-weighted errors and 'limit infinity'.
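
Purely as an illustration of the shape of such a weighting (the
exponent below is an invented knob, not anything proposed in the
thread):

    import math

    def tenney_weight(n, d):
        # weight falling off with Tenney harmonic distance, log2(n*d)
        return 1.0 / math.log2(n * d)

    def steep_weight(n, d, k=1.5):
        # hypothetical steeper fall-off: 2:1 dominates the optimization,
        # and ratios much past the 17-limit contribute next to nothing
        return 1.0 / (n * d) ** k

    for n, d in [(2, 1), (3, 2), (5, 4), (17, 16), (19, 16)]:
        print(n, d, round(tenney_weight(n, d), 3), round(steep_weight(n, d), 6))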

We should be able to search map space and assign generator
values from scratch.  Pure 2:1 generators should definitely
not be assumed.  Instead, we might use the appearance of
many near-octave generators as evidence the weighting is
right.

As far as my combining error and complexity before optimizing
generators, that was wrong.  Moreover, combining them at all
is not for me.  I'm not bound to ask, "What's the 'best' temp.
in size range x?".  Rather, I might ask, "What's the most
accurate temperament in complexity range x?".  Which is just
a sort on all possible temperaments, first by complexity, then
by accuracy.  Which is how I set up Dave's 5-limit spreadsheet
after endlessly trying exponents in the badness calc. without
being able to get a sensible ranking.
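
That sort is trivial to state in code; the records below are
hypothetical placeholders, just to show the ordering:

    # hypothetical (name, complexity, error-in-cents) records
    temps = [("A", 4, 4.2), ("B", 6, 1.0), ("C", 4, 2.9), ("D", 3, 9.7)]

    # first by complexity, then by accuracy within equal complexity
    for name, comp, err in sorted(temps, key=lambda t: (t[1], t[2])):
        print(comp, err, name)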

As for which complexity to use, we have the question of how to
define a map at 'limit infinity'. . .  In the meantime, what
about standard n-limit complexity?

() Gene's geometric complexity sounds interesting (assuming
it's limit-specific...).

() The number of notes of the temperament needed to get all of
the n-limit dyads.

() The taxicab complexity of the n-limit commas on the harmonic
lattice.  Or something that measured how much smaller the
average harmonic structure would be in the temperament than in
JI.  This sort of formulation is probably best, and may in fact
be what Gene's geometric complexity does...
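
A sketch of one taxicab reading of that last option, counting the 3-
and 5-steps of a comma on the lattice and ignoring factors of 2;
whether this matches Gene's geometric complexity is exactly the open
question here.

    from fractions import Fraction

    def monzo(n, d, primes=(2, 3, 5)):
        # prime-exponent vector of n/d over the given primes
        r = Fraction(n, d)
        exps = []
        for p in primes:
            e = 0
            while r.numerator % p == 0:
                r = Fraction(r.numerator // p, r.denominator)
                e += 1
            while r.denominator % p == 0:
                r = Fraction(r.numerator, r.denominator // p)
                e -= 1
            exps.append(e)
        return exps

    def taxicab(n, d):
        # taxicab length on the 3-5 lattice, octaves ignored
        return sum(abs(e) for e in monzo(n, d)[1:])

    print(taxicab(81, 80))    # syntonic comma: 4 + 1 = 5
    print(taxicab(128, 125))  # diesis: 0 + 3 = 3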

>>>f(w3*error(3),w5*error(5),w5*error(5:3))
>>>
>>>where f is either RMS or MAD or MAX or whatever, and w3 is
>>>your weight on ratios of 3, and w5 is your weight on ratios
>>>of 5.
>> 
>>Thanks.  So my f is +,
>
>are you sure? aren't there absolute values, in which case it's
>equivalent to MAD? (or p=1, which gene doesn't want to consider)

Yep, you're right.  Though my choice of + was just an
expedient here, and it would be the MAD of just the
identities, not of all the dyads in the limit.

Last time we tested these things for all-the-dyads-in-a-chord,
I believe I preferred RMS.  Which is not to say that MAD shouldn't
be included in the poptimal series.
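
To make the comparison concrete, here are the three f's applied to
weighted 5-limit errors as in the quoted formula; the weights and the
12-tET errors are just example numbers.

    import math

    def mad(xs): return sum(abs(x) for x in xs) / len(xs)
    def rms(xs): return math.sqrt(sum(x * x for x in xs) / len(xs))
    def mx(xs):  return max(abs(x) for x in xs)

    w3, w5 = 1.0, 0.8                              # example weights
    errs = [w3 * -1.96, w5 * 13.69, w5 * 15.64]    # error(3), error(5), error(5:3) in 12-tET
    for f in (mad, rms, mx):
        print(f.__name__, round(f(errs), 2))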

-Carl




Message: 5980

Date: Wed, 15 Jan 2003 16:48:02

Subject: Re: 103169 etc.

From: manuel.op.de.coul@xxxxxxxxxxx.xxx

Probably only of marginal help, but it took quite some time to run on
my PC:

    1200.0000 cents divided by 103169, step = 0.0116 cents
Nearest to 5/4    : 33213, 386.3137 cents, diff.  0.000378 steps,  0.0000 cents
Nearest to 3/2    : 60350, 701.9550 cents, diff.  0.003763 steps,  0.0000 cents
Nearest to 7/4    : 83294, 968.8259 cents, diff.  0.000046 steps,  0.0000 cents
Nearest to 11/8   : 47399, 551.3168 cents, diff. -0.100663 steps, -0.0012 cents
Nearest to 13/8   : 72264, 840.5316 cents, diff.  0.334719 steps,  0.0039 cents
Misfit numbers M1-M5  :  0.0000  0.0000  0.0000  0.0000  0.0000
Relative errors R1-R5 :   1.505   0.828   0.558  10.485  35.166 % of average
Combined error factor : 0.0000 (3 - 7)
Combined error factor : 0.0028 (3 - 15)
Highest harmonic represented consistently     : 16
  highest error 13/11 : 24865, diff.  0.435382 steps,  0.0051 cents, level 1
Highest harmonic represented uniquely         : 398
  highest error 109/86 : 35276, diff.  0.499875 steps,  0.0058 cents, level 1
Highest harm. represented uniquely inv. equiv.: 396
  highest error 109/86 : 35276, diff.  0.499875 steps,  0.0058 cents, level 1
Consistency levels : 3: 132   5: 132 7: 132 9: 66 11: 4  13: 1  15: 1
Diameters          : 3: 51584 5: 229 7: 44  9: 31 11: 15 13: 11 15: 11
Consistency 16 region : 103168.89683 - 103169.15475 tones/octave
Number of possible Pythagorean generators : 91840
Pyth. maj. third  : 35062, 407.8202 cents, diff.  1849.000 steps,  21.5065 cents
Pyth. dim. fourth : 33045, 384.3596 cents, diff. -167.9996 steps, -1.9541 cents
Basic fifth       : 60182, 700.0010 cents, diff. -167.9962 steps, -1.9540 cents,
  -0.09086 syntonic commas ( 1/11 ), -0.08329 Pythagorean commas ( 1/12 )
Basic Pyth. third : 34390, 400.0039 cents, diff.  1177.000 steps,  13.6902 cents
Number of recognisable fifths : 2948
Number of recognisable thirds : 8597
Best Pyth. comma  : 2017, Basic Pyth. comma  : 1
Best diesis       : 3530, Basic diesis       : 2
Syntonic comma    : 1849, Basic syntonic comma : 1177
Diaschisma        : 1681, Schisma            : 168
Diatonic semitone : 9606, Chromatic semitone : 7925
Minor tone        : 15682, Minor chroma       : 6076
Pythagorean limma : 7757, Apotome            : 9774, Pyth. whole tone : 17531
R = W / H         : 2.260023 = 17531/7757
Basic R           : 2.000116 = 17195/8597
Hyperoche         : 11791, Eschatum           : -13808
Kleisma           : 697,  Wuerschmidt comma  : 984
Septimal kleisma  : 663,  Septimal comma     : 2344
Septimal diesis   : 4193, Harrison comma     : 4361
Undecimal comma   : 4580, Tridecimal comma   : 5617
Basic third       : 34389, 399.9922 cents, diff.  13.6785 cents
Number of cycles of basic fifths         : 1 of 103169 tones
Number of cycles of best fifths          : 1 of 103169 tones
Number of basic fifths in basic third    : 103161 or -8
Number of basic fifths in best third     : 89049 or -14120
Number of best fifths in basic third     : 64909 or -38260
Number of best fifths in best third      : 46795 or -56374
Number of basic fifths in best seventh   : 71007 or -32162
Number of best fifths in best seventh    : 10214 or -92955
Number of basic fifths in best sixth     : 87032 or -16137
Number of best fifths in best sixth      : 46794 or -56375
Number of basic thirds in best seventh   : 81397 or -21772
Number of best thirds in best seventh    : 81120 or -22049
Number of cycles of basic thirds         : 1 of 103169 tones
Number of cycles of best thirds          : 1 of 103169 tones
Number of basic thirds in basic fifth    : 12896 or -90273
Number of basic thirds in best fifth     : 12644 or -90525
Number of best thirds in basic fifth     : 81315 or -21854
Number of best thirds in best fifth      : 76814 or -26355
Number of cycles of best sevenths        : 1 of 103169 tones
Number of best sevenths in basic fifth   : 102922 or -247
Number of best sevenths in best fifth    : 17646 or -85523
Number of best sevenths in basic third   : 1976 or -101193
Number of best sevenths in best third    : 83063 or -20106
Embedded divisions: 11 83 113 913 1243 9379

Manuel
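
The "Nearest to" lines reduce to simple arithmetic; a minimal sketch
(not Scala's actual code) that reproduces the first few figures:

    import math

    def nearest(n, d, division=103169):
        exact = division * math.log2(n / d)    # exact position in steps
        step = round(exact)
        cents = step * 1200.0 / division
        return step, cents, step - exact       # step, cents, diff. in steps

    print(nearest(5, 4))   # (33213, 386.3137..., 0.000378...)
    print(nearest(3, 2))   # (60350, 701.9550..., 0.003763...)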




Message: 5981

Date: Wed, 15 Jan 2003 08:45:14

Subject: Re: Notating Pajara

From: Dave Keenan

--- In tuning-math@xxxxxxxxxxx.xxxx "Gene Ward Smith
<genewardsmith@j...>" <genewardsmith@j...> wrote:
> I'm afraid minimax can sometimes involve a range of values, and
> that's what's happened here. I was simply feeding it to Maple's
> simplex routine and turning the crank, but I need to redo the code so
> that the range is returned when there is one.
>

I've been Internet-challenged for the past few days.

I'm glad Paul remembered this common pitfall with max-absolute
calculations.

Standard practice (among tuning folk) when calculating generators
giving the least max-absolute error (minimax) is not to return a range
(musically we don't really care about such ranges) but simply to
eliminate from the optimisation any consonance whose size is
independent of the generator size (like the 5:7 in Pajara). Of course
one must still take the errors of such consonances into account when
quoting the least max-absolute error for the tuning in cents, as
required for comparison with other temperaments.
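
A sketch of that procedure for Pajara with pure octaves; the mapping
below is reconstructed from the commas 50/49 and 64/63 rather than
taken from any post, and a brute-force scan stands in for a proper
linear program.

    import math

    def cents(n, d):
        return 1200 * math.log2(n / d)

    # Pajara, pure octaves: period = 600 c, generator g = the tempered fifth.
    def errors(g):
        mapped = {(3, 2): g,            (5, 4): 1800 - 2 * g,
                  (5, 3): 3000 - 3 * g, (7, 4): 2400 - 2 * g,
                  (7, 6): 2400 - 3 * g, (7, 5): 600.0}   # 7:5 fixed by the period
        return {r: size - cents(*r) for r, size in mapped.items()}

    def max_err(g, skip=((7, 5),)):
        # max-absolute error, leaving out the generator-independent 7:5
        return max(abs(e) for r, e in errors(g).items() if r not in skip)

    best = min((g / 1000 for g in range(700000, 715001)), key=max_err)
    print(round(best, 3), round(max_err(best), 2))   # g about 709.363, error about 12.4
    print(round(errors(best)[(7, 5)], 2))            # excluded 7:5 error, about 17.5

The scan lands on essentially the 709.363 Manuel quoted in message
5975, with the fixed 7:5 error still reported alongside it.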

Previously, in my case at least, the justification for this procedure
was purely musical, and so it appeared somewhat ad hoc mathematically.
Thanks Gene for pointing out that least max-absolute corresponds to
least sum-of-p_th-powers-of-absolutes as p -> oo. As Paul pointed out,
this now provides a purely mathematical justification for the standard
procedure.

So Gene, does this now throw into doubt all your previous p-optimal
calculations?




Message: 5986

Date: Wed, 15 Jan 2003 18:48:32

Subject: Re: Nonoctave scales and linear temperaments

From: Carl Lumma

>>If we throw out odd-limit, however, we might be better off.
>>If there were a weighting that followed Tenney limit but
>>was steep enough to make near-perfect 2:1s a fact of life
>>and anything much beyond the 17-limit go away, we could
>>have individually-weighted errors and 'limit infinity'.
>
>but there would have to be an infinitely long map, or wedgie,
>or list of unison vectors in order to define the temperament
>family.

Yeah, but we could approximate it with a finite map.

>>We should be able to search map space and assign generator
>>values from scratch.
> 
>i don't understand this.

The number of possible maps I'm interested in isn't that
large.  Gene didn't deny that a map uniquely defined its
generators (still working through how a list of commas
could be more fundamental than a map...).

>as gene expained, we can let the 2:1s fall as they may even
>with the current framework.

What is the current framework?  How have we been searching
for new systems?

>though the choice of what gets defined as "generator" becomes
>arbitrary

I never gave any special significance to the generator
vs. the ie (interval of equivalence).


>>As far as my combining error and complexity before optimizing
>>generators, that was wrong.  Moreover, combining them at all
>>is not for me.  I'm not bound to ask, "What's the 'best' temp.
>>in size range x?".  Rather, I might ask, "What's the most
>>accurate temperament in complexity range x?".
>
>that's exactly how i've been looking at all this for the entire 
>history of this list -- witness my comments to dave defending
>gene's log-flat badness measures; i took exactly this tack!

How could it defend Gene's log-flat badness?  It's utterly
opposed to it!

>>Which is just a sort on all possible temperaments, first by
>>complexity,
> 
>this is exactly how i proposed that we present the results in
>our paper . . .

Cool.

>>then by accuracy.
>
>well, you'll rarely have two temperaments with the same
>complexity,

Funny, I don't see Dave's 5-limitTemp spreadsheet on his
website, but with Graham complexity, you do get a fair
number of collisions IIRC.

>well, at this point, it's easy enough to sort the 5-limit
>database by complexity, at least complexity as defined by
>my heuristic:

Too bad there's nothing explaining the heuristic.  :(

-C.




Message: 5989

Date: Wed, 15 Jan 2003 21:14:05

Subject: Re: Nonoctave scales and linear temperaments

From: Graham Breed

wallyesterpaulrus  wrote:

> we've been searching by commas, i believe -- i'll let gene answer 
> this more fully. graham has been searching by "+"ing ET maps to get 
> linear temperament maps -- something i'm not sure i can explain right 
> now.

I wrote my method up a while back:

How to find linear temperaments *

I've implemented a search by unison vectors as well.  But it isn't as 
efficient if you use an arbitrarily large set of unison vectors, as you 
really should for an exhaustive search.
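
Graham's write-up is linked above; as a rough illustration of how two
ET maps jointly pin down a linear temperament, the 2x2 minors of the
pair (the wedgie, in Gene's terms) are an invariant of the family.
This is not necessarily how Graham's own code combines them.

    from itertools import combinations

    def wedgie(val1, val2):
        # 2x2 minors of two equal-temperament maps (vals)
        return [val1[i] * val2[j] - val1[j] * val2[i]
                for i, j in combinations(range(len(val1)), 2)]

    # the 12- and 22-tone 7-limit maps jointly give Pajara
    print(wedgie([12, 19, 28, 34], [22, 35, 51, 62]))   # [2, -4, -4, -11, -12, 2]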

> that's not what i mean -- i mean, if you're dealing with a planar 
> temperament (which might simply be a linear temperament with 
> tweakable octaves) or something with higher dimension, there's no 
> unique choice of the basis of generators -- gene's used things such 
> as hermite reduction to make this arbitrary choice for him.

Even with linear temperaments, it can be difficult to make the choice 
unique.  I had a lot of trouble with scales with a small period 
generated by unison vectors.  The relative sizes of the possible 
generators can change after you optimize.


                      Graham




Message: 5996

Date: Wed, 15 Jan 2003 01:36:49

Subject: Re: Nonoctave scales and linear temperaments

From: Carl Lumma

>if you choose a different set of generators, you'll get
>a different ranking for which mapping is more complex
>then which!

Doesn't a map uniquely determine its generators?

-Carl




Message: 5999

Date: Wed, 15 Jan 2003 04:44:36

Subject: Re: Nonoctave scales and linear temperaments

From: Carl Lumma

>>Doesn't a map uniquely determine its generators?
> 
>The problem is that the temperament does not uniquely
>determine the map.

What is a temperament, then, if not a map?

-Carl

