This is an Opt In Archive . We would like to hear from you if you want your posts included. For the contact address see About this archive. All posts are copyright (c).

- Contents - Hide Contents - Home - Section 3




Message: 2175

Date: Sun, 02 Dec 2001 07:08:10

Subject: Re: Graham's Top Ten

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

> Oh dear . . . how about a couple of 5-limit examples?
In the 5-limit, the wedge product becomes the cross-product. The cross-product of two ets gives the intersection of the kernels, and the cross-product of two commas gives the common et. So it's not the same.

In the 11-limit, the wedge product of two ets gives a 10-dimensional vector which represents a linear temperament, and of two commas a 10-dimensional vector which represents a planar temperament, so we don't have something betwixt and between. Wedging three ets gives a 10-dimensional planar temperament invariant, which can be compared to two commas, and wedging three commas gives a linear temperament invariant, as with two ets. I could go on and describe the 13-limit, but the point is that the best way to get a handle on the 7-limit is to look at it, not at the 5-limit!
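The 5-limit claims above are easy to check numerically. A Python sketch (the particular ets and commas here are illustrative choices, not taken from the post):

```python
def cross(u, v):
    """Cross product of two 3-vectors of integers."""
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

# 5-limit vals (mappings of 2, 3, 5) for the 12 and 19 ets:
val12 = (12, 19, 28)
val19 = (19, 30, 44)
print(cross(val12, val19))  # (-4, 4, -1): the monzo of 81/80

# Two 5-limit commas:
syntonic = (-4, 4, -1)   # 81/80
diesis   = (7, 0, -3)    # 128/125
print(cross(syntonic, diesis))  # (-12, -19, -28): the 12-et val, negated
```

Crossing the 12- and 19-et vals recovers 81/80, the comma in both kernels; crossing 81/80 with 128/125 recovers the 12-et val up to sign, their common et.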
Message: 2176

Date: Sun, 02 Dec 2001 07:09:53

Subject: Re: Graham's Top Ten

From: Paul Erlich

--- In tuning-math@y..., genewardsmith@j... wrote:
> --- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
>
>> Oh dear . . . how about a couple of 5-limit examples?
> In the 5-limit, the wedge product becomes the cross-product. The
> cross-product of two ets gives the intersection of the kernels, and
> the cross-product of two commas gives the common et. So, it's not the
> same.
>
> In the 11-limit, the wedge product of two ets gives a 10-dimensional
> vector which represents a linear temperament, and of two commas a
> 10-dimensional vector which represents a planar temperament, so we
> don't have something betwixt and between. Wedging three ets gives a
> 10-dimensional planar temperament invariant, which can be compared to
> two commas, and three commas gives a linear temperament invariant, as
> with two ets. I could go on and describe the 13-limit, but the point
> is the best way to get a handle on the 7-limit is to look at it, not
> at the 5-limit!
I'm trying to get an _intuitive_ handle on it . . . sigh . . . better get those math books you mentioned!
Message: 2177

Date: Sun, 02 Dec 2001 07:10:12

Subject: Re: List cut-off point

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
> --- In tuning-math@y..., genewardsmith@j... wrote:
>
>> I'd consider anything, but 2^s sounds ferocious!
> Fokker evaluated equal temperaments using 2^n and found 31-tET best!
If he'd picked a different exponent he might have found something else to be the best. It's probably pretty easy to make 12 the best.
Message: 2178

Date: Sun, 02 Dec 2001 07:15:01

Subject: Re: List cut-off point

From: Paul Erlich

--- In tuning-math@y..., genewardsmith@j... wrote:
> --- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
>> --- In tuning-math@y..., genewardsmith@j... wrote:
>>
>>> I'd consider anything, but 2^s sounds ferocious!
>>
>> Fokker evaluated equal temperaments using 2^n and found 31-tET best!
> If he'd picked a different exponent he might have found something > else to be the best. It's probably pretty easy to make 12 the best.
My point is that 2^n doesn't go unreasonably far in penalizing large systems.
Message: 2179

Date: Sun, 02 Dec 2001 08:12:03

Subject: Re: Graham's Top Ten

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

> I'm trying to get an _intuitive_ handle on it . . . sigh . . .
> better get those math books you mentioned!
If you have library privileges at a university library, the subject area is multilinear algebra, where exterior algebra and bilinear forms are important, and tensor products and differential forms can, let us hope, be safely ignored. However you could simply try to get a feel for it by doing some calculations. Since 64/63 = 2^6 3^-2 7^-1, I might write it 6 [2] - 2 [3] - [7], where the brackets are intended to signal the number is a vector. Then 50/49 = [2] + 2 [5] - 2 [7], and

50/49^64/63 = ([2]+2[5]-2[7])^(6[2]-2[3]-[7]).

Taking the wedge product would be great practice...since a^b = -b^a, a^a=0, so [2]^[2]=0, [3]^[3]=0, [5]^[5]=0, [7]^[7]=0. On the other hand, [2]^[7]= - [7]^[2], and so forth. Why not treat it as one of your challenges and work it out?
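The exercise can also be worked mechanically. A Python sketch (my own; the index pairs refer to the ordered basis [2], [3], [5], [7]):

```python
from itertools import combinations

def wedge(a, b):
    """Wedge product of two vectors, as a dict mapping index pairs
    (i, j) with i < j to the coefficient a_i*b_j - a_j*b_i."""
    return {(i, j): a[i]*b[j] - a[j]*b[i]
            for i, j in combinations(range(len(a)), 2)}

# Monzos on the ordered basis [2], [3], [5], [7]:
m50_49 = (1, 0, 2, -2)    # 50/49 = 2 * 5^2 * 7^-2
m64_63 = (6, -2, 0, -1)   # 64/63 = 2^6 * 3^-2 * 7^-1

print(wedge(m50_49, m64_63))
# {(0, 1): -2, (0, 2): -12, (0, 3): 11, (1, 2): 4, (1, 3): -4, (2, 3): -2}
```

The antisymmetry a^b = -b^a is built into the coefficient formula, so swapping the arguments flips every sign.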
Message: 2180

Date: Sun, 02 Dec 2001 08:43:30

Subject: Re: Graham's Top Ten

From: Paul Erlich

--- In tuning-math@y..., genewardsmith@j... wrote:
> --- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
>
>> I'm trying to get an _intuitive_ handle on it . . . sigh . . .
>> better get those math books you mentioned!
>
> If you have library privileges at a university library, the subject
> area is multilinear algebra, where exterior algebra and bilinear
> forms are important, and tensor products and differential forms can,
> let us hope, be safely ignored. However you could simply try to get a
> feel for it by doing some calculations. Since 64/63 = 2^6 3^-2 7^-1,
> I might write it 6 [2] - 2 [3] - [7], where the brackets are intended
> to signal the number is a vector. Then 50/49 = [2] + 2 [5] - 2 [7],
> and
>
> 50/49^64/63 = ([2]+2[5]-2[7])^(6[2]-2[3]-[7]). Taking the wedge
> product would be great practice...since a^b = -b^a, a^a=0, so
> [2]^[2]=0, [3]^[3]=0, [5]^[5]=0, [7]^[7]=0. On the other hand,
> [2]^[7]= - [7]^[2], and so forth. Why not treat it as one of your
> challenges and work it out?
What will this tell me _intuitively_?

P.S. look over at harmonic_entropy if you haven't lately.
Message: 2181

Date: Sun, 02 Dec 2001 09:10:56

Subject: Re: Graham's Top Ten

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

> What will this tell me _intuitively_?
You need to start from knowing what something is before developing the intuition.
Message: 2182

Date: Sun, 02 Dec 2001 09:52:22

Subject: Re: List cut-off point

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

> Did you have a problem with the (s^2)*c or whatever rule that graham
> used?
When I use a s^2 c rule, I still get ennealimmal coming out on top.
> Would you consider (2^s)*c?
I tried that, and the best system turned out to be <49/48,25/24>, which is below your lower cut, and it gives me the impression that this is too extreme in the other direction. Best and worst were:

<25/24,49/48> (pretty much 6+4=10): 2^c s = 118.69
<2048/2025,4375/4374> (46,80,92 ets): 2^c s = 1.01 x 10^7
Message: 2183

Date: Sun, 02 Dec 2001 10:25:22

Subject: Re: List cut-off point

From: Paul Erlich

--- In tuning-math@y..., genewardsmith@j... wrote:
> --- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
>
>> Did you have a problem with the (s^2)*c or whatever rule that
>> graham used?
>
> When I use a s^2 c rule, I still get ennealimmal coming out on top.
What if you were to consider more complex unison vectors? You'd just keep finding better and better ones, wouldn't you?
>> Would you consider (2^s)*c?
>
> I tried that, and the best system turned out to be <49/48,25/24>,
> which is below your lower cut,
How did that system get in there in the first place? Aren't there even better systems according to this criterion, say ones using 21/20 as a unison vector?
> and it gives me the impression that this is too extreme in the other
> direction.
What else makes the top 10 with this criterion?
> Best and worst were:
>
> <25/24,49/48> (Pretty much 6+4=10): 2^c s = 118.69
>
> <2048/2025,4375/4374> (46,80,92 ets): 2^c s = 1.01 x 10^7.
You mean 2^s c, not 2^c s?
Message: 2184

Date: Sun, 02 Dec 2001 10:29:33

Subject: Re: List cut-off point

From: Paul Erlich

What about something like

2^(s/3)*c?


Message: 2185

Date: Sun, 02 Dec 2001 10:50:15

Subject: Re: List cut-off point

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
> --- In tuning-math@y..., genewardsmith@j... wrote:
>> When I use a s^2 c rule, I still get ennealimmal coming out on top.
> What if you were to consider more complex unison vectors? You'd just > keep finding better and better ones, wouldn't you?
It certainly looks that way, but I haven't even thought about a proof.
>>> Would you consider (2^s)*c?
>>
>> I tried that, and the best system turned out to be <49/48,25/24>,
>> which is below your lower cut,
>
> How did that system get in there in the first place? Aren't there
> even better systems according to this criterion, say ones using 21/20
> as a unison vector?
No doubt, but my list of generating commas started with a cut-off at 49/48.
> You mean 2^s c, not 2^c s?

Sorry.
Partial results for s^2 c on my list of 66 temperaments are:

1. Ennealimmal, 33.46
2. <2401/2400, 3136/3125> system, 89.14
3. Miracle, 94.70
...
65. [0 12] [0 19] [-1 28] [-1 34], 975.31
66. <2048/2025,4375/4374>, 1187.06
Message: 2186

Date: Sun, 02 Dec 2001 10:56:09

Subject: Re: List cut-off point

From: Paul Erlich

Can you come up with a goodness measure for linear temperaments just 
as you did for ETs?


Message: 2187

Date: Mon, 03 Dec 2001 20:16:02

Subject: Re: List cut-off point

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

> So it's the best 7-limit linear temperament ever discovered
> according to steps^2 cents?
I've got a new list of 505 temperaments, we'll see if anything beats it. However, it is what you might call a "capstone temperament" in the 7-limit; that is, it uses the two smallest superparticular commas to define itself. Similarly, meantone would be a linear temperament capstone in the 5-limit, and <3025/3024,4375/4374,9801/9800> a capstone for the 11-limit.

Since 7 is the higher of a prime pair (5 and 7) its smallest superparticular commas are relatively small, so I suppose capstone temperaments in the 13-limit would be more to the point, and maybe planar or 3D more than linear.
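The "two smallest superparticular commas" in the 7-limit can be found by brute force. A sketch (the search bound is my own choice; Stormer's theorem guarantees the full list is finite):

```python
def smooth(n, primes=(2, 3, 5, 7)):
    """True if n factors completely over the given primes."""
    for p in primes:
        while n % p == 0:
            n //= p
    return n == 1

# Superparticular ratios (n+1)/n with both sides 7-limit:
pairs = [(n + 1, n) for n in range(1, 10000) if smooth(n) and smooth(n + 1)]
print(pairs[-2:])  # [(2401, 2400), (4375, 4374)]
```

The two largest-n pairs are 2401/2400 and 4375/4374, the commas defining ennealimmal in message 2188 below the cut.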
Message: 2188

Date: Mon, 03 Dec 2001 01:02:20

Subject: Re: List cut-off point

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

> Can you come up with a goodness measure for linear temperaments just
> as you did for ETs?
Here's an estimate: if we have d odd primes, and two good ets of about size n, then the error in relative cents is on the order of n^(-1/d), which from the connection I showed between this and generator steps means that the number of steps is on the order of n^(1-1/d). Since the error in relative cents is O(n^(-1/d)) for the ets, the error in absolute cents is O(n^(-1-1/d)). The shift from ets to linear temperaments leads to some improvement, so the error goes down, but it doesn't seem to go down a great deal.

We end up with an estimate of (n^(1-1/d))^b * n^(-1-1/d) for step^b cents; this gives exponent zero for b = (d+1)/(d-1). This is infinity for d=1, where we have chains of perfect fifths, 3 for d=2 (5-limit), 2 for d=3 (7-limit), 5/3 for the 11-limit, and so forth.

I conclude that we can expect an infinite number of linear temperaments under some fixed limit in step^2 cents in the 7-limit, which seems to be the case. I suspect we will get no such thing for step^3 cents, since the improvement to linear temperaments seems unlikely to give that much. Here is what I got from my list of 66 temperaments for step^3 cents. Notice that ennealimmal temperament still does pretty well!

(1) [2,3,1,-6,4,0] <27/25,49/48> ets: 4,5,9,14 measure: 210.36

(2) [4,2,2,-1,8,6] <25/24,49/48> ets: 4,6,10 measure: 294.93

(3) [4,4,4,-2,5,-3] <36/35,50/49> ets: 4,12,16,28 measure: 433.02

(4) [18,27,18,-34,22,1] <2401/2400,4375/4374> ets: 27,72,99,171,270,441,612 measure: 535.89

The first to pass Paul's lower cut is *still* ennealimmal.

(5) [6,5,3,-7,12,-6] <49/48,126/125> ets: 4,15,19,34 measure: 642.89

...

(65) [12,12,-6,-19,19] <50/49,3645/3584> ets: 12,48 measure: 9556.05

Fails Paul's strong TM condition. Map:

[ 0 12]
[ 0 19]
[-1 28]
[-1 34]

(66) [2,-4,30,81,-42,-11] <2048/2025,4375/4374> ets: 46,80 measure: 26079.25

What do you think? This one might work.
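The exponent bookkeeping above can be restated as a one-liner (just re-deriving b = (d+1)/(d-1) from the same estimate):

```python
from fractions import Fraction

def critical_exponent(d):
    """Solve b*(1 - 1/d) - (1 + 1/d) = 0: the b for which the estimate
    (n^(1-1/d))^b * n^(-1-1/d) neither grows nor shrinks with n.
    (d = 1 divides by zero: the exponent is infinite for chains of fifths.)"""
    return Fraction(d + 1, d - 1)

for d, name in [(2, '5-limit'), (3, '7-limit'), (4, '11-limit')]:
    print(name, critical_exponent(d))
# 5-limit 3
# 7-limit 2
# 11-limit 5/3
```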
Message: 2189

Date: Mon, 03 Dec 2001 20:27:33

Subject: Re: List cut-off point

From: Paul Erlich

--- In tuning-math@y..., genewardsmith@j... wrote:

> Since 7 is the higher of a prime pair (5 and 7) its smallest
> superparticular commas are relatively small, so I suppose capstone
> temperaments in the 13-limit would be more to the point, and maybe
> planar or 3D more than linear.
Can you explain this sentence? I don't understand it at all.
Message: 2190

Date: Mon, 03 Dec 2001 02:53:16

Subject: some tetrachordality results

From: Carl Lumma

All;

I've implemented in scheme some of the stuff regarding the
generalized tetrachordality measure that went around a
while back.

My procedure finds the mean absolute deviation of a scale
from its transposition at 702 cents, first in scale order,
then in any order.  Specifically, I do a brute force
note-to-note compare with all rotations of the notes of the
transposed scale in the former case, permutations in the
latter case, and return the minimum for each (source available).
Scales in ()s are degrees of 12-tET.  Values are mean cents
deviation, rounded to the nearest cent.

Pentatonic Scale
                 (0 2 5 7 9) = 21, 21
       1/1 9/8 4/3 3/2 27/16 = 18, 18

Diatonic Scale
            (0 2 4 5 7 9 11) = 16, 16
1/1 9/8 5/4 4/3 3/2 5/3 15/8 = 16, 16

Diminished chord
                   (0 3 6 9) = 102, 102

Wholetone scale
              (0 2 4 6 8 10) = 102, 102

Diminished scale
          (0 2 3 5 6 8 9 11) = 50, 50
          (0 1 3 4 6 7 9 10) = 102, 102

Minor scales w/'gypsy' tetrachord
            (0 1 4 5 7 8 11) = 44, 44
            (0 1 4 5 7 8 10) = 44, 44
            (0 1 4 5 7 9 10) = 44, 44

The issues I'd like to bring to your attention are:

() Anybody care to compare results, or see
any obvious bugs?  Debugging always appreciated.

() Different modes of a scale will give different
results, as with the diminished scale above.  I
consider all "rotations", but I do not normalize
the rotations to their tonics to get modes.  Maybe
I should?  IOW, would we think omnitetrachordality
could be used to find stable modes of a scale?

() Notice that allowing permutations (2nd set of
values) never lowers the score.  So I either have a
bug, or I don't need to fuss with permutations.
Normally, we could lower the meandev. between these
two...

(1 2 3 10 11 12)
(1 10 2 11 3 12)

By allowing permutations, vs. enforcing neighboring
order.  However, maybe because I'm restricting myself
to transpositions, this sort of situation never arises.
Sound reasonable?

-Carl
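Carl's scheme source isn't included here, but the rotation-only part of the measure he describes can be reconstructed. A Python sketch (my reconstruction from the description, reproducing the 12-tET figures above; it skips the permutation search):

```python
def tetra_deviation(scale, shift=702, octave=1200):
    """Mean absolute deviation in cents between a scale and its
    transposition by `shift`, minimized over rotations of the
    transposed scale.  Notes that wrap around get an octave added so
    that linear (not circular) differences are compared."""
    n = len(scale)
    trans = sorted((s + shift) % octave for s in scale)
    best = float('inf')
    for k in range(n):
        rot = trans[k:] + [t + octave for t in trans[:k]]
        dev = sum(abs(a - b) for a, b in zip(scale, rot)) / n
        best = min(best, dev)
    return best

print(round(tetra_deviation([0, 200, 500, 700, 900])))             # 21, pentatonic
print(round(tetra_deviation([0, 200, 400, 500, 700, 900, 1100])))  # 16, diatonic
print(round(tetra_deviation([0, 200, 400, 600, 800, 1000])))       # 102, wholetone
```

These match Carl's values for the 12-tET pentatonic (21), diatonic (16), wholetone (102), and diminished chord (102) above.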


Message: 2191

Date: Mon, 3 Dec 2001 21:09 +00

Subject: 21 note CS lattice

From: graham@xxxxxxxxxx.xx.xx

Here's a lattice for Paul's 21 note constant structure.

                9^------5
               / \     /
            1 / 4 \ 7 / 0v------6v
             /     \ /
    0---3---6---9---2v--5v--8v
     \     / \     /
2     \ 8 / 1v\ 4v/ 7v
       \ /     \ /
        3v------9v

3, 9 and 5v are the only intervals of 11.  Everything else is at the 
simplest 7-prime limit location.  Here's the decimal translation:

6   1/1
7v 21/20     7  15/14
8v  9/8      8   8/7
9v  6/5      9  11/9      9^  5/4
0v 21/16     0   4/3
1v  7/5      1  10/7
2v  3/2      2  32/21
3v  8/5      3  18/11
4v 12/7      4   7/4
5v 11/6      5  15/8
6v 63/32     6   2/1


                     Graham


Message: 2192

Date: Mon, 03 Dec 2001 04:56:25

Subject: Re: List cut-off point

From: Paul Erlich

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
> What about something like
>
> 2^(s/3)*c?
Oops -- I was actually thinking 2^(s^(1/3))*c . . . How does that look, Gene?
Message: 2193

Date: Mon, 03 Dec 2001 05:10:09

Subject: Re: List cut-off point

From: Paul Erlich

--- In tuning-math@y..., genewardsmith@j... wrote:
> --- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
>
>> Can you come up with a goodness measure for linear temperaments
>> just as you did for ETs?
>
> Here's an estimate: if we have d odd primes, and two good ets of
> about size n, then the error in relative cents is on the order of
> n^(-1/d), which from the connection I showed between this and
> generator steps means that number of steps is on the order of
> n^(1-1/d). Since the error in relative cents is O(n^(-1/d)) for the
> ets, the error in absolute cents is O(n^(-1-1/d)). The shift from ets
> to linear temperaments leads to some improvement, so the error goes
> down, but it doesn't seem to go down a great deal.
Except in cases like ennealimmal, huh?
> We end up with an estimate of (n^(1-1/d))^b * n^(-1-1/d) for step^b
> cents; this gives exponent zero for b = (d+1)/(d-1). This is
> infinity for d=1, where we have chains of perfect fifths, 3 for d=2
> (5-limit), 2 for d=3 (7-limit), 5/3 for the 11-limit and so forth.
>
> I conclude that we can expect an infinite number of linear
> temperaments under some fixed limit in step^2 cents in the 7-limit,
> which seems to be the case. I suspect we will get no such thing for
> step^3 cents, since the improvement to linear temperaments seems
> unlikely to give that much.
Does step^3 cents come out of one of the considerations above, or is it merely an arbitrary function that increases more rapidly than step^2 cents?
> Here is what I got from my list of 66 temperaments for step^3 cents.
> Notice that ennealimmal temperament still does pretty well!
>
> (1) [2,3,1,-6,4,0] <27/25,49/48> ets: 4,5,9,14 measure: 210.36
> (2) [4,2,2,-1,8,6] <25/24,49/48> ets: 4,6,10 measure: 294.93
> (3) [4,4,4,-2,5,-3] <36/35,50/49> ets: 4,12,16,28 measure: 433.02
> (4) [18,27,18,-34,22,1] <2401/2400,4375/4374> ets: 27,72,99,171,270,441,612 measure: 535.89
And you don't think an infinite number of even better ones would be found if we moved out further in the lattice? I really would like to cut the cord to the original list of unison vectors, if possible.
Message: 2194

Date: Mon, 03 Dec 2001 05:38:16

Subject: Re: List cut-off point

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

>> The shift from ets to linear temperaments leads to some improvement,
>> so the error goes down, but it doesn't seem to go down a great deal.
>
> Except in cases like ennealimmal, huh?
Well, maybe it doesn't except when it does. :) If you are suggesting the actual critical exponent is going to be higher than 2, because of the improvement of linear temperament over ets, then I would not be surprised, but getting it all the way up to 3 is another matter, I think. The critical exponent is *at least* 2, however.
> Does step^3 cents come out of one of the considerations above, or is
> it merely an arbitrary function that increases more rapidly than
> step^2 cents?
It's an arbitrary function of polynomial growth--using exponential growth would, I suspect, be overkill.
> And you don't think an infinite number of even better ones would be
> found if we moved out further in the lattice?
All I've really argued for is that this will, indeed, happen if we use steps^2; I *think* steps^3 will kill it off eventually.

> I really would like to cut the cord to the original list of unison
> vectors, if possible.
On that note, I made up a list of 990 pairs of ets which I can use to seed things, but I'm waiting to see what to use for a cut-off.
Message: 2195

Date: Mon, 03 Dec 2001 06:02:58

Subject: Re: List cut-off point

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

> Except in cases like ennealimmal, huh?
I've been looking at some extreme examples, concocted from high-powered ets in the 1000-10000 range, and it seems likely that 2 actually is the critical exponent, and pretty well certain that even if it isn't, using steps^3 will lead to a finite list. Ennealimmal may just be exceptionally good.
Message: 2196

Date: Mon, 03 Dec 2001 13:44:32

Subject: Re: List cut-off point

From: Paul Erlich

--- In tuning-math@y..., genewardsmith@j... wrote:
> --- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
>
>> Except in cases like ennealimmal, huh?
>
> I've been looking at some extreme examples, concocted from
> high-powered ets in the 1000-10000 range, and it seems likely that 2
> actually is the critical exponent, and pretty well certain that even
> if it isn't using steps^3 will lead to a finite list. Ennealimmal may
> just be exceptionally good.
So it's the best 7-limit linear temperament ever discovered according to steps^2 cents?
Message: 2197

Date: Tue, 04 Dec 2001 20:38:31

Subject: Re: List cut-off point

From: genewardsmith@xxxx.xxx

--- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:

> Can you explain this sentence? I don't understand it at all.
It's simply conjecture on my part that the higher of a pair of twin primes should have a comparatively larger largest superparticular ratio associated to it than the lower, but I think I'll investigate it as a question in number theory.
Message: 2198

Date: Tue, 04 Dec 2001 21:26:53

Subject: Re: List cut-off point

From: Paul Erlich

--- In tuning-math@y..., genewardsmith@j... wrote:
> --- In tuning-math@y..., "Paul Erlich" <paul@s...> wrote:
>
>> Can you explain this sentence? I don't understand it at all.
>
> It's simply conjecture on my part that the higher of a pair of twin
> primes should have a comparatively larger largest superparticular
> ratio associated to it than the lower,
Assuming this is true, can you explain the sentence?
Message: 2199

Date: Wed, 05 Dec 2001 04:33:04

Subject: Re: Top 20

From: paulerlich

Would you do the 5-limit too, that would be so cool!

