This is an Opt-In Archive. We would like to hear from you if you want your posts included. For the contact address, see About this archive. All posts are copyright (c).

Message: 3300

Date: Wed, 16 Jan 2002 23:19:24

Subject: Re: Hi Dave K.

From: paulerlich

--- In tuning-math@y..., "dkeenanuqnetau" <d.keenan@u...> wrote:
> --- In tuning-math@y..., "paulerlich" <paul@s...> wrote:
>> Dave, perhaps you can help with this Kees lattice business. It's pure
>> math, but Gene appears to have no interest in it. It seems to me that
>> we should be able, based on Kees's lattice with a 'taxicab' metric,
>> to define an error function so that ...
>
> Sorry. No time. But here's how I'd approach it. Set up a spreadsheet
> with the ten examples in 10 rows. Columns would be n and d of the ratio
> to be tempered out, and for each (octave-equivalent) interval the error
> in cents and the number of generators required, then the error and gens
> functions of n and d that you propose. Then I'd try a weighted rms error
> and a weighted rms gens where the weights can be changed for each
> interval (indep. wts for err and gens). I'd calculate the least-squares
> difference between the weighted rms results and your proposed functions
> of n and d. Then I'd fool around with the error weights to minimise the
> error in the errors and fool around with the gens weights to minimise
> the error in the gens. I hope this makes sense.
Thanks, but I don't think RMS will work. That implies a Euclidean metric, but a "taxicab" metric seems to be what we want here.
Message: 3301

Date: Wed, 16 Jan 2002 00:59:54

Subject: Re: algorithm sought

From: paulerlich

--- In tuning-math@y..., "clumma" <carl@l...> wrote:
>>>> ||3^a 5^b 7^c|| = sqrt(a^2 + 4b^2 + 4c^2 + 2ab + 2ac + 4bc)
>>>>
>>>> would be the length of 3^a 5^b 7^c. Everything in a radius of 2
>>>> of anything will be consonant.
>>>
>>> What happens to 9:5 and 9:7?
>>
>> ||9/5|| = ||3^2 5^(-1)|| = sqrt(4+4+0-4+0+0) = 2
>>
>> ||9/7|| = ||3^2 7^(-1)|| = sqrt(4+0+4+0-4+0) = 2
>>
>> Hence both 9/5 and 9/7 are consonant with 1.
>
> But they should have the same distance from 1 as 3/2,
> or any other tonality-diamond pitch, no?
Why? Certainly if we were picky about it, we'd want a precise length to be associated with each interval, and whether these are equal or unequal, we'd almost certainly be in non-Euclidean space pretty fast. But Gene has simply found a handy mathematical formula for asking the binary question "is a certain interval o-limit consonant" -- that was the context in which he brought this up.

> -Carl
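For anyone who wants to check those lengths numerically, here is a minimal sketch (mine, not Gene's) that just evaluates the quadratic form quoted at the top of this exchange:

import math

def length(a, b, c):
    # ||3^a 5^b 7^c|| = sqrt(a^2 + 4b^2 + 4c^2 + 2ab + 2ac + 4bc)
    return math.sqrt(a*a + 4*b*b + 4*c*c + 2*a*b + 2*a*c + 4*b*c)

print(length(2, -1, 0))    # 9/5 -> 2.0
print(length(2, 0, -1))    # 9/7 -> 2.0
print(length(1, 0, 0))     # 3/2 -> 1.0 (octave-equivalent)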
Message: 3302

Date: Wed, 16 Jan 2002 23:45:09

Subject: Re: metric visualization

From: paulerlich

--- In tuning-math@y..., "keesvp" <kees@d...> wrote:

> Let f3 and f5 be the errors.
> If they are of equal sign:
>   error = max( |f3*log(5)|, |f5*log(3)| )
> else:
>   error = |f3*log(5)| + |f5*log(3)|
>
> I'm afraid I'm being much more intuitive than scientific here. In the
> normal meantone case it gives regular 1/4 comma anyway as far as I
> can see.
Kees, I don't think it gives 1/4-comma meantone. In the vicinity of 1/4-comma meantone, since the errors are the same sign,

error = max( |f3*log(5)|, |f5*log(3)| )

But f3 and f5 are equal in 1/4-comma meantone, while this error function is actually minimized with a fifth of 698.01 cents.

However, I don't think your original formula makes much sense. Shouldn't "if they are of equal sign" be replaced with "if they are of opposite sign"?

In any case, if I directly use the formula you use to evaluate ETs on Searching Small Intervals * [with cont.] (Wayb.):

f3 + log(3)/log(5)*f5 + log(3)/log(5)*f(5/3)

interpret f3, f5, and f(5/3) as absolute values, and minimize this sum, I do get 1/4-comma meantone. But if I take the MAX instead of the sum of these three terms, I get a fifth of 697.1160 cents. This is quite nice, and might belong on the table of optimal meantones.

Is this the "Kees optimal meantone" corresponding to my triangular lattice in lattice orientation * [with cont.] (Wayb.)? If so, what error function would we use to calculate the "Kees optimal meantone" corresponding to _your_ triangular lattice, the one right after mine, in lattice orientation * [with cont.] (Wayb.)? I'd be pleased if you gave me a bunch of guesses -- the more I have to try out, the better.
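As a side note for anyone replaying this: the sum-vs-max comparison is easy to reproduce numerically. The sketch below is mine, not Kees's or Paul's -- the just values, the log(3)/log(5) weights and the crude grid search are only one way to set it up:

import math

def cents(r):
    return 1200 * math.log(r, 2)

J3, J5, J53 = cents(3/2), cents(5/4), cents(5/3)
W = math.log(3) / math.log(5)            # the log(3)/log(5) weight above

def weighted_errors(fifth):
    # octave-equivalent errors of 3, 5 and 5/3 in a chain of meantone fifths
    f3 = abs(fifth - J3)
    f5 = abs(4 * fifth - 2400 - J5)
    f53 = abs(3 * fifth - 1200 - J53)
    return f3, W * f5, W * f53

grid = [690 + i / 1000.0 for i in range(15000)]   # fifths from 690 to 705 cents
best_sum = min(grid, key=lambda x: sum(weighted_errors(x)))
best_max = min(grid, key=lambda x: max(weighted_errors(x)))
print(best_sum, best_max)    # roughly 696.6 (1/4-comma) and 697.1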
Message: 3303

Date: Wed, 16 Jan 2002 01:09:28

Subject: [tuning] Re: badly tuned remote overtones

From: paulerlich

--- In tuning-math@y..., "monz" <joemonz@y...> wrote:
>
>> From: paulerlich <paul@s...>
>> To: <tuning-math@y...>
>> Sent: Tuesday, January 15, 2002 3:42 PM
>> Subject: [tuning-math] [tuning] Re: badly tuned remote overtones
>>
>> --- In tuning-math@y..., "monz" <joemonz@y...> wrote:
>>>
>>> I think one would *have* to include a 5-limit "enharmonic
>>> unison-vector" here, since Schoenberg explicitly equated A#=Bb,
>>> C#=Db, D#=Eb, etc., the usual 12-EDO enharmonically-equivalent
>>> stuff.
>>
>> Did he do this explicitly within any of the 'constructions
>> of unison vectors' you gleaned from him?
>
> Well, not specifically *this* interval. But according to his
> notational usage, *any* of the 5-limit enharmonicities should apply.
Right, but . . . did he apply any of them explicitly within any of the 'constructions of unison vectors' you gleaned from him? Otherwise, you're just "assuming the answer".
>
>> And anyway, why not 128:125? Seems simpler . . .
>
> OK, Paul, I tried 128:125 in place of 2048:2025, and the
> inverse I get is:
>
> [ 12  7 12  0  -9 ]
> [ 19 11 19  0 -14 ]
> [ 28 16 28  0 -21 ]
> [ 34 20 34 -1 -26 ]
> [ 41 24 42  0 -31 ]
>
> So you're right ... this still shows the inconsistent
> mapping to 11 in h12(11)=41, g12(11)=42. Naturally, since
> I only replace one row of the UV-matrix, there's only
> one column of the inverse that's different (see? ... I really
> *am* learning this stuff!), and that's the last column.
>
> So, I asked before ... what do these other columns mean?
> What's -h9 showing us that's different from the -h2 in
> my other matrix?
>
> [ -2 ]
> [ -3 ]
> [ -5 ]
> [ -6 ]
> [ -7 ]
>
> And what about that h0 column? What does that mean?
I'll leave these questions to Gene . . .
> I also tried plugging the skhisma into the "5-limit
> enharmonicity" row of the matrix (other than 81:80, that is).
> This time, the last column of the inverse reads:
>
> [ -5 ]
> [ -8 ]
> [ -11 ]
> [ -14 ]
> [ -17 ]
>
> and all other columns are the same as the two I derived
> before, but with the signs reversed. If I use the complement
> of the skhisma, the other four rows are the same as before
> and the last one is as above but all positive.
>
> If I assume what is probably the most basic case, and
> plug the Pythagorean comma into that row,
??? Why is that the most basic case?
> So, now it seems that I've found the inconsistency in > Schoenberg's mapping of 5 as well.
Only if you assume the Pythagorean comma, right?
> >
>>> So now, does that mean that *this* periodicity-block
>>> is the prime candidate for the one Schoenberg probably
>>> had in mind?
>>
>> What do you mean, *this* periodicity block? How many notes does it
>> contain? You only reported a determinant of 1, but that wasn't for a
>> PB, that was for a "notation" with one "extra" unison vector relative
>> to what would be needed for a PB.
>
> Yikes! All too true. So then, what relevance does my construction
> have, if any at all? It seems to me to show the mechanics of
> Schoenberg's notational inconsistency.

Right.

> Can you or someone else clear up this business about the difference
> between PBs and "notations"? Is it that prime-factor 2 is left out
> of PB calculations (assuming "8ve"-equivalency) but must be included
> for "notation"?
You need to include the prime-factor 2 for PB calculations too, if you're to weed out cases of torsion. As for "notation", I suggest you ask Gene.
> Paul, I understand the criticisms you've written about what
> I'm trying to do here, but I still think the effort is worthwhile.
>
> Certainly, Schoenberg's "pantonal" [= atonal] style assumed
> that any of the 12-EDO pitches could be used equally well as
> the center of its own tonal universe. It would be informative
> to see how each of those universes may be modeled conceptually,
> and how they relate to each other.
Any of the PBs that give you a determinant of 12, if all the unison vectors are tempered out, implies 12-tET. Geometrically, this will be modeled by a torus or hyper-torus . . . can you make out the inflatable torus model in the photocopy of the Hall article I sent you (sorry the photocopy didn't come out so good -- check your library for a better version)?
Message: 3304

Date: Wed, 16 Jan 2002 23:49:30

Subject: Re: adding/subtracting unison-vectors

From: paulerlich

--- In tuning-math@y..., "monz" <joemonz@y...> wrote:
> Question:
>
> What significance is there in the addition or subtraction
> of unison-vectors? In either case, the result is another
> unison-vector which may be substituted in the matrix in
> place of either of those two, correct?
This is true, and it's a rather elementary operation in linear algebra. Gene?
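(A quick worked example, mine rather than from the post: adding the monzos of 81:80 = [-4 4 -1] and 128:125 = [7 0 -3] gives [3 4 -4] = 648:625, the "major diesis", which could replace either of the two in such a matrix.)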
Message: 3305

Date: Wed, 16 Jan 2002 23:51:07

Subject: Re: algorithm sought

From: paulerlich

--- In tuning-math@y..., "genewardsmith" <genewardsmith@j...> wrote:
> --- In tuning-math@y..., "paulerlich" <paul@s...> wrote:
>
>> I'm concerned that this won't always work -- Gene, perhaps you could
>> post the implied lengths for
>
> As I said, you need to decide for anything above the 7-limit what
> to do about prime powers.
>
> Here are 7-limit lengths:
>
>> 1:3
>> 1:5
>> 3:5
>> 1:7
>> 3:7
>> 5:7
>
> All one.
>
> Now let us suppose ||3||=1 and ||5||=||7||=||9||=||11||=2, then
>
>> 1:9
>> 5:9
>> 7:9
>> 1:11
>
> All 2.
>
>> 3:11
>
> sqrt(3)
>
>> 5:11
>> 7:11
>> 9:11
>
> All 2.
How about 1:15? What is the shortest "dissonant" interval?
Message: 3306

Date: Wed, 16 Jan 2002 01:50:15

Subject: Re: algorithm sought

From: genewardsmith

--- In tuning-math@y..., "paulerlich" <paul@s...> wrote:

> I'm concerned that this won't always work -- Gene, perhaps you could
> post the implied lengths for

As I said, you need to decide for anything above the 7-limit what to do about prime powers. Here are 7-limit lengths:

> 1:3
> 1:5
> 3:5
> 1:7
> 3:7
> 5:7

All one.

Now let us suppose ||3||=1 and ||5||=||7||=||9||=||11||=2, then

> 1:9
> 5:9
> 7:9
> 1:11

All 2.

> 3:11

sqrt(3)

> 5:11
> 7:11
> 9:11

All 2.
Message: 3307

Date: Wed, 16 Jan 2002 02:06:38

Subject: Re: algorithm sought

From: genewardsmith

--- In tuning-math@y..., "paulerlich" <paul@s...> wrote:

> I'm very impressed, Gene! And there's a Euclidean model for this?
This is based on a positive-definite quadratic form, so it *is* a Euclidean model.
> Will it break down in, say, the 15-limit?
Let o be an odd limit, and suppose the q_i are the largest odd prime powers <= o, one for each odd prime <= o. We may write any "2-unit" rational number (meaning one with odd numerator and denominator) uniquely in the form

r = q_1^e_1 ... q_k^e_k.

Then if we define a quadratic form on the exponents by

A(r) = \sum_{i <= j} e_i e_j

we have

A(q_i) = 1, A(q_i/q_j) = 1.

Since the number of these conditions equals the number of coefficients of A, this defines A uniquely. We can then define the canonical o-limit length of any 2-unit rational number, and therefore of any octave equivalence class, by

||r|| = sqrt(A(r))

Since A is positive definite (it is in fact a well-known such form in mathematics) it defines a Euclidean distance. If we like, we may adjust matters by multiplying through by the lcm of the exponents of the largest prime powers, so as to be able to work with integers.

This sort of thing is what I meant when I said I came upon hexanies and the like geometrically. This metric is useful partly because two octave equivalence classes separated by a distance of one or less are o-consonant, and by a distance of greater than one are o-dissonant.
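(A quick worked check, not in the original post: in the 7-limit case q_1 = 3, q_2 = 5, q_3 = 7, so A(5) = 1, A(5/3) = (-1)^2 + 1^2 + (-1)(1) = 1, while A(15) = 1 + 1 + 1 = 3, giving ||15|| = sqrt(3) > 1 -- i.e. 15 comes out as a 7-limit dissonance, as expected.)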
Message: 3308

Date: Wed, 16 Jan 2002 02:22:10

Subject: Re: algorithm sought

From: clumma

>>>> What happens to 9:5 and 9:7?
>>>
>>> ||9/5|| = ||3^2 5^(-1)|| = sqrt(4+4+0-4+0+0) = 2
>>>
>>> ||9/7|| = ||3^2 7^(-1)|| = sqrt(4+0+4+0-4+0) = 2
>>>
>>> Hence both 9/5 and 9/7 are consonant with 1.
>>
>> But they should have the same distance from 1 as 3/2,
>> or any other tonality-diamond pitch, no?
>
> Why? Certainly if we were picky about it, we'd want a precise
> length to be associated with each interval, and whether these are
> equal or unequal, we'd almost certainly be in non-Euclidean space
> pretty fast. But Gene has simply found a handy mathematical
> formula for asking the binary question "is a certain interval
> o-limit consonant" -- that was the context in which he brought this
> up.
Okay, I guess I was just thinking too taxicab-ish. But I'm a long way from understanding this stuff. What can it do that a taxicab metric can't? -Carl
Message: 3309

Date: Thu, 17 Jan 2002 00:09:29

Subject: ERROR IN CARTER'S SCHOENBERG (Re: badly tuned remote overtones)

From: paulerlich

--- In tuning-math@y..., "monz" <joemonz@y...> wrote:

> Therefore, the total list
> of unison-vectors implied by Schoenberg's 1911 diagram is:
>
> Bb 11*4=44 : Bb  7*6=42  = 22:21
> F  16*4=64 : F   7*9=63  = 64:63
> F  11*6=66 : F  16*4=64  = 33:32
> F  11*6=66 : F   7*9=63  = 22:21
> A   9*9=81 : (A 20*4=80) = 81:80
> C  11*9=99 : (C 24*4=96) = 33:32
OK . . .
> But because 22:21, 33:32, and 64:63 form a dependent triplet
> (any one of them can be found by multiplying the other two),
> this does not suffice to create a periodicity-block, which
> needs another independent unison-vector.
>
> So I will give Paul Erlich the benefit of the doubt
> and assume that Schoenberg was following tradition as
> closely as possible in his note naming, by thinking in
> terms of meantone, and therefore the most likely candidate
> for the other independent UV is the 5-limit diesis
> 128:125 = [ 7 0 -3 ].
This is not like anything I would say. Giving me the benefit of the doubt, huh? This reminds me of where you recently told Klaus that you disagree with me about sustained notes changing their intonation . . . if your interpretations of Schoenberg and other theorists are as good as your interpretation of me . . . well, whatever, I love you Monz, you get a big ice cream with a cherry on top. You better tell the tuning list if you believe that Partch's criticism of Schoenberg was based on a mistranslation!!!
Message: 3310

Date: Thu, 17 Jan 2002 06:59:12

Subject: Re: metric visualization

From: paulerlich

I wrote,

> In any case, if I directly use the formula you use to evaluate ETs on
> Searching Small Intervals * [with cont.] (Wayb.):
>
> f3 + log(3)/log(5)*f5 + log(3)/log(5)*f(5/3)
>
> interpret f3, f5, and f(5/3) as absolute values, and minimize this
> sum, I do get 1/4-comma meantone. But if I take the MAX instead of
> the sum of these three terms, I get a fifth of 697.1160 cents. This
> is quite nice, and might belong on the table of optimal meantones.
>
> Is this the "Kees optimal meantone" corresponding to my triangular
> lattice in lattice orientation * [with cont.] (Wayb.)?
If so, it might be the one (or one of them) we want to consider for our paper. But just to test how close it comes to my heuristic:

meantone: heuristic error = 0.00280938; optimized error = 0.00649007; ratio = 0.43287
enneadecal: heuristic error = 5.31406e-5; optimized error = 1.35666e-4; ratio = 0.39170

Still not as good as it should be. What if I try the weights off Kees' here-posted matrix directly:

0.5 * sqrt(3) * log(5) for f3;
0.5 * log(5) for f(5/3);
log(3) for f5.

Then I get

meantone: heuristic error = 0.00280938; (optimal generator = 697.417¢;) optimized error = 0.00529287; ratio = 0.53078
enneadecal: heuristic error = 5.31406e-5; optimized error = 9.93739e-5; ratio = 0.53475

Now that's more like it! If I use 80 instead of 81 for d in the meantone one, I get a ratio of 0.539 instead of 0.531 -- so we seem to have perfect agreement within the vagaries of the heuristic!

Now it's time to try one more example -- how about the 5-limit MAGIC system? The UV is 3125:3072 . . .

magic: heuristic error = 0.002107568; (optimal generator = 380.791¢;) optimized error = 0.005056214; ratio = 0.41682

:( This is so sad! Who can fix this?
Message: 3311

Date: Thu, 17 Jan 2002 00:22:39

Subject: Re: algorithm sought

From: clumma

>> Cool! One thing it will allow us to do is subtract out the
>> natural JI chords and leave only magic chords in your lists
>> on the main list.
>
> How's this as a method: using the standard o-limit metric, take
> everything in a radius of 1 of the unison, which should give you
> the o-limit diamond. Now take all subsets of size k, find the
> centroid by averaging the coordinates (which should be in the
> prime-power basis, so that in the 9-limit 5/3 would be
> 9^(-1/2) * 5^1 * 7^0 = [-1/2, 1, 0], for instance) and test if
> everything is within a radius of 1/2 of the centroid, in which
> case put it on your list. For larger values of o, this would be
> faster than simply testing for pairwise consonance.
Okay, thanks. To answer your question, I'm going to have to do some homework. Anyone else is welcome to beat me to it! -Carl
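A rough sketch of how Gene's suggestion might look in code (my reconstruction, not Gene's or Carl's -- the coordinates use 9, 5, 7 as the prime-power basis and the all-ones quadratic form, so 3 sits at (1/2, 0, 0)):

from itertools import combinations

# 9-odd-limit identities in the prime-power basis (9, 5, 7); 3 = 9^(1/2)
idents = {'1': (0, 0, 0), '3': (0.5, 0, 0), '5': (0, 1, 0),
          '7': (0, 0, 1), '9': (1, 0, 0)}

# the diamond: every identity measured against every other
diamond = {p + '/' + q: tuple(a - b for a, b in zip(u, v))
           for p, u in idents.items() for q, v in idents.items() if p != q}

def form(v):
    a, b, c = v
    return a*a + b*b + c*c + a*b + a*c + b*c      # squared length

def dist(u, v):
    return form(tuple(x - y for x, y in zip(u, v))) ** 0.5

# Gene's first claim: the whole diamond sits within distance 1 of the unison
print(all(form(v) <= 1.000001 for v in diamond.values()))

def passes(names, radius=0.5):
    # Gene's subset test: every member within 'radius' of the subset's centroid
    pts = [diamond[n] for n in names]
    centroid = tuple(sum(c) / len(pts) for c in zip(*pts))
    return all(dist(p, centroid) <= radius + 1e-9 for p in pts)

# count of 2-element subsets passing the radius test (i.e. consonant dyads);
# larger k applies the same test to bigger chords
print(sum(1 for names in combinations(diamond, 2) if passes(names)))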
Message: 3312

Date: Thu, 17 Jan 2002 07:19:05

Subject: Re: metric visualization

From: paulerlich

--- In tuning-math@y..., "paulerlich" <paul@s...> wrote:

> What if I try the weights off Kees' here-posted matrix directly:
>
> 0.5 * sqrt(3) * log(5) for f3;
> 0.5 * log(5) for f(5/3);
> log(3) for f5.

As that didn't work, I'll try the weights in the other matrix Kees posted:

0.91024 for f3;
0.57735 for f5;
0.52553 for f(5/3).

I'll use the geometric mean of n and d in place of d in the denominator of the heuristic: |n-d|/(gm*log(gm)).

Meantone: optimal generator = 697.415¢ . . . ratio = 0.81962
Magic: opt. gen. = 380.508¢ . . . ratio = 0.760549

Back to the drawing board . . .
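For reference, the heuristic itself is cheap to compute. A sketch (my code), taking the heuristic exactly as written above, |n-d|/(gm*log(gm)), for the two unison vectors named in these posts:

import math

def heuristic(n, d):
    gm = math.sqrt(n * d)                  # geometric mean of n and d
    return abs(n - d) / (gm * math.log(gm))

print(heuristic(81, 80))       # ~0.0028 for the meantone comma
print(heuristic(3125, 3072))   # ~0.0021 for the MAGIC unison vector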
Message: 3313

Date: Thu, 17 Jan 2002 00:56:48

Subject: Re: algorithm sought

From: genewardsmith

--- In tuning-math@y..., "paulerlich" <paul@s...> wrote:

> How about 1:15?
The 7-limit quadratic form is

A7(3^a 5^b 7^c) = a^2 + b^2 + c^2 + a*b + a*c + b*c

If I substitute a/2 for a in this, I get what I just called (with more optimism than accuracy) the "standard" o-limit form,

A9(3^a 5^b 7^c) = 1/4 (a^2 + 4*b^2 + 4*c^2 + 2*a*b + 2*a*c + 4*b*c)

Clearing the denominators so as to be able to work only with integers gives us the equivalent

B9(3^a 5^b 7^c) = a^2 + 4*b^2 + 4*c^2 + 2*a*b + 2*a*c + 4*b*c

If I plug 15 = 3^1 5^1 7^0 into this, I get 7, so the length is sqrt(7).

What is the shortest "dissonant" interval? The theta series of the above quadratic form, defined as the sum over i, j and k from -infinity to infinity of q^B9(i,j,k), is the q-series

Th9(q) = 1 + 2*q + 4*q^3 + 12*q^4 + 4*q^5 + 8*q^7 + 6*q^8 + 6*q^9 + 4*q^11 + 24*q^12
       + 12*q^13 + 8*q^15 + 12*q^16 + 8*q^17 + 12*q^19 + 24*q^20 + 8*q^21 + 8*q^23
       + 8*q^24 + 14*q^25 + 16*q^27 + 48*q^28 + 4*q^29 + 16*q^31 + 6*q^32 + 16*q^33
       + 8*q^35 + 36*q^36 + 20*q^37 + 8*q^39 + 24*q^40 + 8*q^41 + 20*q^43 + 24*q^44
       + 20*q^45 + 16*q^47 + 24*q^48 + 18*q^49 + ...

This defines a modular form, but I presume we needn't worry about the deeper properties. I see from the term 4*q^5 that there are four terms of length sqrt(5), the shortest 9-limit dissonances. These turn out to be 7/15, 5/21, 21/5, and 15/7, which we can also write as 14/5, 20/21, 21/20 and 15/14.
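That 4*q^5 term is easy to spot-check by brute force. A sketch (mine, not Gene's code); it enumerates exponent triples in a small box and tallies the values of B9:

from collections import Counter

def B9(a, b, c):
    return a*a + 4*b*b + 4*c*c + 2*a*b + 2*a*c + 4*b*c

tally, examples = Counter(), {}
R = 6
for a in range(-R, R + 1):
    for b in range(-R, R + 1):
        for c in range(-R, R + 1):
            v = B9(a, b, c)
            tally[v] += 1
            examples.setdefault(v, []).append((a, b, c))

print(tally[5])       # 4 -- the four shortest 9-limit dissonances
print(examples[5])    # exponent triples for 15/7, 7/15, 21/5, 5/21 (octave-reduced: 15/14 etc.)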
Message: 3314

Date: Thu, 17 Jan 2002 00:16:24

Subject: Re: yahoo spaced out (was re: algorithm sought)

From: monz

> From: Carl Lumma <carl@xxxxx.xxx>
> To: monz <joemonz@xxxxx.xxx>
> Sent: Wednesday, January 16, 2002 12:45 PM
> Subject: Re: [tuning-math] yahoo spaced out (was re: algorithm sought)
>
> Yahoo is undoing the 30-year-old tradition of ASCII art in
> mailing lists, for no apparent reason whatever.
Yeah, when I read this post this afternoon (just before leaving for work), I wrote to the guy at Yahoo who handles mail regarding copyright violations -- only because I couldn't find anyone appropriate to contact. I explained this business to him, and gave him a sample link to a tuning-math post with lots of ASCII graphics. Waiting for a response now.

-monz
Message: 3315

Date: Thu, 17 Jan 2002 01:00:24

Subject: Re: algorithm sought

From: paulerlich

--- In tuning-math@y..., "genewardsmith" <genewardsmith@j...> wrote:
> --- In tuning-math@y..., "paulerlich" <paul@s...> wrote:
>
>> How about 1:15?
>
> The 7-limit quadratic form is

I thought we were talking 11-limit.

> I see from the term 4*q^5 that there are four terms of length sqrt(5),
> the shortest 9-limit dissonances. These turn out to be 7/15, 5/21,
> 21/5, and 15/7, which we can also write as 14/5, 20/21, 21/20 and 15/14.
This holds true for the 11-limit, I presume? BTW, the taxicab metric and Kees' lattice, which I keep banging my head against but am not enough of a mathematician for, would seem to be a more natural approach -- this should interest Carl especially.
Message: 3316

Date: Thu, 17 Jan 2002 00:45:12

Subject: Re: [tuning] Re: badly tuned remote overtones

From: monz

> From: paulerlich <paul@xxxxxxxxxxxxx.xxx>
> To: <tuning-math@xxxxxxxxxxx.xxx>
> Sent: Wednesday, January 16, 2002 4:00 PM
> Subject: [tuning-math] [tuning] Re: badly tuned remote overtones
>
> --- In tuning-math@y..., "monz" <joemonz@y...> wrote:
>
>>> Right, but . . . did he apply any of them explicitly within
>>> any of the 'constructions of unison vectors' you gleaned
>>> from him? Otherwise, you're just "assuming the answer".
>>
>> Well, on p 176 of _Harmonielehre_ (p 155 of the Carter translation),
>> Schoenberg illustrates the "Circle of 5ths", and explicitly notates
>> the equivalences Cb=B, Gb=F#, and Db=C# for the major keys, and
>> ab=g#, eb=d#, and bb=a# for the minor keys that are +5, +6,
>> and +7 "5ths" (respectively) from the origin C-major/a-minor.
>
> OK, but isn't this separate from the 'constructions of unison
> vectors' in which Schoenberg tries to arrive at a 13-limit
> justification of 12-tET?
Careful, Paul ... the 13-limit justification of 12-tET only came in 1927/34, in the lecture/article "Problems of Harmony". Here, I'm specifically interested in Schoenberg's 1911 theory (really 1910, published a year later), and this only goes up to 11-limit and does not claim to generate the entire 12-tET scale. The _Harmonielehre_ illustration only results in a 9-tone scale (C, D, Eb, E, F, G, A, Bb, B) if one considers the unison-vectors to be tempered out. What I've been trying to do is to gather as much data as possible on Schoenberg's conception of the 12-tET scale at the time he wrote _Harmonielehre_, to construct periodicity-blocks which "explain" the finity of his tonal universe at that time. As I've already noted, the 13-limit unison-vectors in his 1927/34 explanation clearly delineate a 12-tone PB. But the earlier version from _Harmonielehre_ has been harder to glean. The misprint in the diagram in the Carter translation certainly didn't help me with this! Now I'm *really* glad that I finally went thru the effort to obtain a copy of the original German edition!
> I mean, this is traditional harmony, which Schoenberg loves > to explain, but is trying to break away from in his own music, no?
Yes, you're right about that. But the whole of _Harmonielehre_ is permeated with the quest for "the truth", and that starts right with these diagrams I've been examining.

Schoenberg felt that he should begin by tearing apart musical sounds themselves, to study just what it is that we hear. And he finds that the overtones that are already present in every harmonic timbre seem not only to agree quite well with the construction of chords in traditional tonality, but also to offer a paradigm for the construction of the more unusual chords he wanted to put into his own music of the time.

Aside from the overtone and circle-of-5ths diagrams, Schoenberg offers very little else in the way of graphical assistance other than the copious examples in musical staff notation. And he continually stressed, not only in this book but for the rest of his life, that his style was an *evolution* out of what he had learned from Bach, Mozart, Beethoven, Brahms, Wagner, and Mahler. So his explanations of traditional harmony are entirely relevant to his own pantonal/atonal work, from his perspective.
>> To my mind, the 3-limit (linear, 1-D) is both historically and
>> conceptually more basic than 5-limit (planar, 2-D). The
>> notational difference between a "sharp" and what later became
>> its enharmonically equivalent "flat" occurred first in Pythagorean
>> tuning. And so, along this line of reasoning, the Pythagorean
>> comma is historically and conceptually a more basic enharmonicity
>> than any of the 5-limit examples. However, as implied above, I
>> will also grant the possibility that Schoenberg may have intended
>> the diesis as a unison-vector, and will examine that case below
>> as well.
>
> Schoenberg sees the diatonic scale not as an essentially 3-limit
> entity, as I do, but as an essentially 5-limit entity. So why would
> the chromatic scale fall back to 3-limit in his thinking? Doesn't
> seem to make sense.

Wow, I have to concede that you're right about that, Paul! Very good.

This is analogous to the case of Ben Johnston. His "basic scale" is exactly the same 7-tone 5-limit JI diatonic scale Schoenberg illustrates in his first diagram, which goes up to the 6th harmonic on F, C, and G. Schoenberg's second diagram (on the very next page in _Harmonielehre_) is the one which goes up to the 12th harmonics (and the isolated 16th in one case) to illustrate how E and B "won" over Eb and Bb in the diatonic scale.

So yes, I can see that it's much more likely that a 5-limit unison-vector would come into play, than a 3-limit one. In fact, my guess is that Schoenberg most likely thought of "normal" pitch-relationships as having a basis in some kind of meantone/12-EDO hybrid, which after all is what the notation "spells" ... unless one is assuming Pythagorean tuning.
>> Also, I understand Gene's "notation" a little better now.
>> So, taking this particular matrix as an example,
>>
>>     2    3   5   7  11     unison-vector     ~cents
>>
>> [  -2    2   1   0  -1 ]   = 45:44            38.90577323
>> [ -19   12   0   0   0 ]   = 531441:524288    23.46001038
>> [  -5    1   0   0   1 ]   = 33:32            53.27294323
>> [   6   -2   0  -1   0 ]   = 64:63            27.2640918
>> [  -4    4  -1   0   0 ]   = 81:80            21.5062896
>>
>> inverse
>>
>> [ 12  -7  12   0  12 ]
>> [ 19 -11  19   0  19 ]
>> [ 28 -16  28   0  27 ]
>> [ 34 -20  34  -1  34 ]
>> [ 41 -24  42   0  41 ]
>
> Adjoint?

Well ... *this* is the adjoint of this matrix:

[ -12   7 -12   0 -12 ]
[ -19  11 -19   0 -19 ]
[ -28  16 -28   0 -27 ]
[ -34  20 -34   1 -34 ]
[ -41  24 -42   0 -41 ]

and since the determinant is -1, the inverse is as I gave it.

-monz
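For anyone who wants to re-check this kind of bookkeeping mechanically, here is a sketch (my code, not part of the exchange) that takes the unison-vector matrix above and prints its determinant and adjugate with numpy:

import numpy as np

# rows are the monzos (2 3 5 7 11) of 45:44, 531441:524288, 33:32, 64:63, 81:80
M = np.array([
    [ -2,   2,  1,  0, -1],
    [-19,  12,  0,  0,  0],
    [ -5,   1,  0,  0,  1],
    [  6,  -2,  0, -1,  0],
    [ -4,   4, -1,  0,  0],
])

det = np.linalg.det(M)
adjugate = np.round(np.linalg.inv(M) * det).astype(int)
print(round(det))      # -1, as reported above
print(adjugate)        # the adjoint; divide by the determinant to get the inverse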
Message: 3317

Date: Thu, 17 Jan 2002 00:50:11

Subject: Re: ERROR IN CARTER'S SCHOENBERG (Re: badly tuned remote overtones)

From: monz

> From: paulerlich <paul@xxxxxxxxxxxxx.xxx>
> To: <tuning-math@xxxxxxxxxxx.xxx>
> Sent: Wednesday, January 16, 2002 4:09 PM
> Subject: [tuning-math] ERROR IN CARTER'S SCHOENBERG (Re: badly tuned remote overtones)
>
>> So I will give Paul Erlich the benefit of the doubt
>> and assume that Schoenberg was following tradition as
>> closely as possible in his note naming, by thinking in
>> terms of meantone, and therefore the most likely candidate
>> for the other independent UV is the 5-limit diesis
>> 128:125 = [ 7 0 -3 ].
>
> This is not like anything I would say. Giving me the benefit of the
> doubt, huh? This reminds me of where you recently told Klaus that you
> disagree with me about sustained notes changing their
> intonation . . . if your interpretations of Schoenberg and other
> theorists are as good as your interpretation of me . . . well,
> whatever, I love you Monz, you get a big ice cream with a cherry on
> top.
OK, so I misunderstand things sometimes too ... like most people. Sorry. :( All I'm saying is that *if* Schoenberg had any kind of meantone conception in mind -- which I think is quite likely, given its ubiquity in European music, right down to our current notation, (*this* is why I'm giving a nod to you, Paul!!) -- then the unison-vector he'll bump into, in 1/4-comma meantone, is 128:125.
> You better tell the tuning list if you believe that Partch's > criticism of Schoenberg was based on a mistranslation!!!
Wow, I hadn't even *thought* of *that*! I'll have to get some sleep and check those details tomorrow. But thanks for the suggestion.

-monz
Message: 3318

Date: Thu, 17 Jan 2002 09:12:32

Subject: The 'Arabic' temperament

From: genewardsmith

I put the 11-limit temperament programs which I had written but not used to
work on this, and got [2,8,-11,5,8,-23,1,-48,-16,52] for the wedgie
from the val side. I then plugged 81/80, 121/120 and 6144/6125 in, and
found it had torsion. This was depressing until I found that 
81/80, 121/120 and 176/175 (among other possibilities) also works.
If no one seriously objects I'll dub the linear 11-limit temperament
with the above wedgie and kernel basis the "Arabic".

From the wedgie, I get as a period matrix

[ 0  1]
[ 2  1]
[ 8  0]
[-11 6]
[ 5  2]

This has generators a = 9.0053365/31 = 15.9772099/55; b = 1. This
is, of course, the neutral third of about 11/9, or 348.59367 cents if
you want to get picky.

badness = 293.7893492

rms = 6.681354997

g = 9.680613914


The 7-tone MOS is Mohajira:

[5, 4, 5, 4, 5, 4, 4]
[9, 7, 9, 7, 9, 7, 7]
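As a sanity check on that mapping (a sketch of mine, not Gene's program -- it just plays back the period matrix and generator quoted above against the just primes):

import math

def cents(x):
    return 1200 * math.log2(x)

gen = 1200 * 9.0053365 / 31        # ~348.59 cents, the generator quoted above
octave = 1200.0

# period matrix rows for primes 2, 3, 5, 7, 11: (generator steps, octaves)
mapping = {2: (0, 1), 3: (2, 1), 5: (8, 0), 7: (-11, 6), 11: (5, 2)}

for p, (g, o) in mapping.items():
    error = g * gen + o * octave - cents(p)
    print(p, round(error, 2))       # signed error of each prime, in cents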


Message: 3319

Date: Thu, 17 Jan 2002 12:05 +0

Subject: Re: algorithm sought

From: graham@xxxxxxxxxx.xx.xx

In-Reply-To: <a24ku6+60k1@xxxxxxx.xxx>
carl wrote:

> ...than intervals. You're probably right. It seems to get around
> the problem of having more than one instance of the same thing,
> turning n^2 into n!. But for my ultimate purpose, I'll need to
> consider all inversions if I use notes, which I think turns it
> back into n^2.
n^2 is a piddling calculation for a modern computer. In this case, though, it is quite a bit higher. But n is also quite low, so no problem.
> I didn't see your python post, but here's the scheme:
>
> (define combo
>   (lambda (k ls)
>     (if (zero? k)
>         (list (list))
>         (if (null? ls)
>             (list)
>             (append (combo k (cdr ls))
>                     (map (lambda (y) (cons (car ls) y))
>                          (combo (sub1 k) (cdr ls))))))))
>
> You can't count lines of code in scheme like you can in
> a block-structured imperative language. And comparing
> expressions -- () pairs -- to # of lines isn't fair. Frankly,
> I don't know how they measure algorithms in scheme, but I'll
> bet eggs benedict you can't get more compact than this,
> considering this exposes some of the actual operations on
> the UTM tape (car and cdr).
The best way of measuring algorithms in any language is to use a profiler. I'm guessing the biggest constraints on this in Python are going to be the recursion depth and the size of the returned list. It works fine for 4 combinations of 29, which is the same as enumerating all 5-note 11-limit chords. 1001 of them, apparently. 5 from 29 takes 12 seconds. 6 from 29 takes about 52 seconds. Scheme is optimised for recursion, so that won't be a problem. You can get around storing the resulting list in both languages if you really want to. I'm guessing that'll be easier for Scheme.
> Without a compiler, I think any of these brute-force methods > are all off limits, so to speak. Scheme compilers are all mucho > bucks, industrial (though Chez hints at plans to release a > consumer version within the year!), except ones that "compile" > to C, which I've never been able to get to work. Anyhow, it's > high time I learned another language.
I thought there were good, Free compilers for either Scheme or Common Lisp. Whatever, a compiler should only buy you about an order of magnitude improvement, which is the difference between finishing while you wait and finishing while you go and make a cup of tea. The real problem, not finishing at all within the lifetime of the universe, won't be solved by either.
> In the mean time, I'm convinced there's a better way than > brute force. Gene may have found one...
The method on <Anomalous Saturated Suspensions * [with cont.] (Wayb.)> combined with a normal otonal and utonal enumeration will work for JI. All you need brute force for is to verify that, which is certainly possible in the 19-limit. ASSes are only 4-note chords, which is 3-from-whatever. If there are exceptions, you'll find them with a 4-from-whatever search.

I don't know off-hand how many intervals there are in the 15-limit. But it can't be more than 64. 4 from 64 is 229 seconds for my Python algorithm. If you're actually doing something with them, it should still be within the cup-of-tea timescale. There's a lot of disk activity, so use a generator or whatever and it should speed it up. Rejecting all chords that are already outside the limit before you add notes to them should speed it up a great deal more. Give it a try.

The usual rule of thumb is that Scheme is faster than Python. Perhaps that assumes you have a compiler.

Graham
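For what it's worth, here is a sketch in Python of the kind of combination search being discussed, with the early rejection Graham suggests; the consonance test is a stub of mine, not a real odd-limit check:

def consonant_with_all(chord, note):
    # stub consonance test -- swap in a real odd-limit or lattice-distance check
    return True

def chords(notes, k, partial=()):
    # every k-element combination, pruning a branch as soon as a note fails
    if k == 0:
        yield partial
        return
    for i, note in enumerate(notes):
        if consonant_with_all(partial, note):
            yield from chords(notes[i + 1:], k - 1, partial + (note,))

print(sum(1 for _ in chords(list(range(14)), 4)))    # 1001 with the stub test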
Message: 3320

Date: Thu, 17 Jan 2002 12:05 +0

Subject: Re: yahoo spaced out (was re: algorithm sought)

From: graham@xxxxxxxxxx.xx.xx

In-Reply-To: <a24mrc+bhm9@xxxxxxx.xxx>
carl wrote:

> What is supposed to be the point of the space-removal > thing? I can't fathom it. And "view unformatted message" > does nothing (my eyes, the goggles do nothing!).
It's partly to reduce the bandwidth, not having to send so many spaces. Also, to allow them to put adverts in the messages.
> Needless to say, my beautiful indenting was destroyed.
Be glad you're not using a language with syntactic indentation. Graham
Message: 3321

Date: Thu, 17 Jan 2002 13:21:01

Subject: Re: Archive

From: Robert Walker

Hi there,

I'll be making an archive of the tuning-math posts in a few
days time.

It is opt in, so let me know if you want your posts included.
I won't assume that _anyone_ wants to be included unless you
say so.

To participate in the archive, send an e-mail to me to say so, e.g. at

tuning_archive @ rcwalker.freeserve.co.uk
(remove the spaces about the @)
tuning_archive@xxxxxxxx.xxxxxxxxx.xx.xx

I'll assume it is okay to let your posts be found in search engines,
and to include them on cd for use of members of the tuning group,
unless you say otherwise.

For practical reasons, I'll treat the tuning group and its offshoots
such as tuning-math and harmonic_entropy as all the same community.

...........................................

Some members may wish to exclude some of their posts
from the archive.

If so, say so in the message, and give a list of the
dates for the posts you want *included*.

E.g.
"include all the posts from 31st Nov. 1999 to 1st May 2001."
and all posts from 2nd Oct 2001 to present"

If you use numbers for the month, I'll assume the American
convention (with month first) unless you say otherwise:

"include 11/31/99 - 5/1/01, 10/2/01 - present"

...........................................


Robert


Message: 3322

Date: Thu, 17 Jan 2002 14:07:39

Subject: Re: algorithm sought

From: clumma

>> ...than intervals. You're probably right. It seems to get around
>> the problem of having more than one instance of the same thing,
>> turning n^2 into n!. But for my ultimate purpose, I'll need to
>> consider all inversions if I use notes, which I think turns it
>> back into n^2.
>
> n^2 is a piddling calculation for a modern computer.
Whoops. That was supposed to be n^k for intervals, n!/(n-k)! for pitches, and k(n!)/(n-k)! for all inversions of pitches.
>The best way of measuring algorithms in any language is to use a >profiler.
Measuring algorithms, or their compactness in different languages?
>I'm guessing the biggest constraints on this in Python are going >to be the recursion depth and the size of the returned list. It >works fine for 4 combinations of 29, which is the same as >enumerating all 5-note 11-limit chords.
Aren't there any ASSes that contain more than one instance of an 11-limit interval?
>> Without a compiler, I think any of these brute-force methods >> are all off limits, so to speak. Scheme compilers are all mucho >> bucks, industrial (though Chez hints at plans to release a >> consumer version within the year!), except ones that "compile" >> to C, which I've never been able to get to work. Anyhow, it's >> high time I learned another language. >
>I thought there were good, Free compilers for either Scheme or >Common Lisp.
Not that I could find. The only ones I know of are Gambit scheme (C translator), chicken (ditto), and something that comes with Dr. Scheme from Rice university, which I can't stand, though I probably should have looked at more closely. I did verify that the Dr. Scheme interpreter blows up at the same point the Chez one does.
> Whatever, a compiler should only buy you about an order of
> magnitude improvement, which is the difference between finishing
> while you wait and finishing while you go and make a cup of tea.

Mmm, tea.
I'm not sure. The problem isn't cps, it's memory. The garbage collection thingy runs out of RAM and starts hitting the swap, and you're better off waiting for the sun to collapse than for Windows to recover. I assume compiled code doesn't have this problem, for some reason.
>The real problem, not finishing at all within the lifetime of >the universe, won't be solved by either.
There's nothing NP here that I can see.
>> In the mean time, I'm convinced there's a better way than >> brute force. Gene may have found one... >
>The method on <Anomalous Saturated Suspensions * [with cont.] (Wayb.)> combined >with a normal otonal and utonal enumeration will work for JI.
As in, I have to tack on the ASSes. I can just use your table. I admitted this is quite satisfactory, that I was just being obstinate, from the start. It isn't just a question of results for me -- it's understanding. The nature of the problem is fairly simple, and the ASSes and o- and u-tonalities should all spin out from the same process. If you could get your ASS method to produce Partchian tonalities... Actually, the picking notes instead of intervals thing is worth trying. I'll do that. Thanks. I'm sure I've given away to you, by stating my eventual need for inversions, that I'm going after n-adic uniqueness.
> I don't know off-hand how many intervals there are in the 15-limit.
> But it can't be more than 64. 4 from 64 is 229 seconds for my Python
> algorithm. If you're actually doing something with them, it should
> still be within the cup-of-tea timescale. There's a lot of disk
> activity, so use a generator or whatever and it should speed it up.

Generator?
Yeah, it pays to check every round I think.
> Give it a try.

Will do.

> The usual rule of thumb is that Scheme is faster than Python.
> Perhaps that assumes you have a compiler.
It must. I suspect also that companies like Chez intentionally leave out smarts from their free interpreters in order to protect their industrial licenses. The interp. you get for free (the "petite") isn't the same one that comes with a $3K license, I'll wager.
>Be glad you're not using a language with syntactic indentation.
True, ()'s are great. Any scheme expression can be unambiguously parsed no matter how you lay it out (even on one line). -Carl
Message: 3323

Date: Thu, 17 Jan 2002 14:12:03

Subject: Re: algorithm sought

From: clumma

>> The best way of measuring algorithms in any language is to use a
>> profiler.
>
> Measuring algorithms, or their compactness in different languages?
As in, I have no idea what a profiler is, but it sounds like something which gets rid of the difference between languages rather than exposing it?
>> I'm guessing the biggest constraints on this in Python are going
>> to be the recursion depth and the size of the returned list. It
>> works fine for 4 combinations of 29, which is the same as
>> enumerating all 5-note 11-limit chords.
>
> Aren't there any ASSes that contain more than one instance of
> an 11-limit interval?
Of course there are -- they wouldn't be ASSes otherwise, they'd just be subsets of Partchian tonalities. Hmm, that's a shortcut right there. . . . -C.
Message: 3324

Date: Thu, 17 Jan 2002 15:07 +0

Subject: Re: algorithm sought

From: graham@xxxxxxxxxx.xx.xx

In-Reply-To: <a26lrb+nnau@xxxxxxx.xxx>
Me:
>> The best way of measuring algorithms in any language is to use a
>> profiler.

Carl:
> Measuring algorithms, or their compactness in different languages?
Measuring efficiency in one particular language implementation. Profilers are tools that inspect running code, and record how much time is spent in each section. That means you can go straight to the least efficient parts, and not waste effort optimising things that aren't a problem in the first place.

Carl:
> Aren't there any ASSes that contain more than one instance of > an 11-limit interval?
If you're measuring from the root, that'd mean two notes at the same pitch, which is silly.

Me:
>> I thought there were good, Free compilers for either Scheme or
>> Common Lisp.

Carl:
> Not that I could find. The only ones I know of are Gambit scheme > (C translator), chicken (ditto), and something that comes with > Dr. Scheme from Rice university, which I can't stand, though I > probably should have looked at more closely. I did verify that > the Dr. Scheme interpreter blows up at the same point the Chez > one does.
There's a GNU interpreter for Common Lisp, at least. Perhaps it's one of the C translators.

Me:
>> Whatever, a compiler should only buy you about an order of
>> magnitude improvement, which is the difference between finishing
>> while you wait and finishing while you go and make a cup of tea.

Carl:
> Mmm, tea.
>
> I'm not sure. The problem isn't cps, it's memory. The garbage > collection thingy runs out of RAM and starts hitting the swap, > and you're better off waiting for the sun to collapse than for > Windows to recover. I assume compiled code doesn't have this > problem, for some reason.
You may need a good garbage collector, but that's independent of being compiled or interpreted. I expect a GNU interpreter will be fine. RMS is a Lisp programmer, after all.
>> The real problem, not finishing at all within the lifetime of >> the universe, won't be solved by either. >
> There's nothing NP here that I can see.
Polynomial time is all you need if it gets complex enough. 10^50 operations does it. That's roughly the total number of chess games. And it's only 48!.

Me:
>> The method on <Anomalous Saturated Suspensions * [with cont.] (Wayb.)> combined
>> with a normal otonal and utonal enumeration will work for JI.

Carl:
> As in, I have to tack on the ASSes. I can just use your table. > I admitted this is quite satisfactory, that I was just being > obstinate, from the start. It isn't just question of results > for me -- it's understanding. The nature of the problem is > fairly simple, and the ASSes and o- and u-tonalities should all > spin out from the same process. If you could get your ASS > method to produce Partchian tonalities...
It might be useful to generalise the method to work with inharmonic timbres, provided the consonances are defined on a short list of partials. You can get all three chord types as subsets of Euler genera. Perhaps that would be a helpful approach.

Me:
>> I don't know off-hand how many intervals there are in the 15-limit.
>> But it can't be more than 64. 4 from 64 is 229 seconds for my Python
>> algorithm. If you're actually doing something with them, it should
>> still be within the cup-of-tea timescale. There's a lot of disk
>> activity, so use a generator or whatever and it should speed it up.

Carl:
> Generator?
See <PEP 255 -- Simple Generators * [with cont.] (Wayb.)> for Simple Generators. The code I have now uses a function to generate a huge list, and then iterates over it. If I turned the function into a generator, it'd only have to return one result at a time, without the code getting much more complex. I haven't looked at this yet, partly because I'm trying to keep compatibility with older versions of Python. I think you can do similar, but more advanced things with the Lisp family. The interpreter/compiler may even do them for you. You can also write a function that takes a function as argument, and calls it for each item. But in this case it'll probably be easier to forget about reusability, and put the functionality in the middle of the combinations code, so that the following can work:
>> Rejecting all chords that are already outside the limit before
>> you add notes to them should speed it up a great deal more.
>
> Yeah, it pays to check every round I think.
>
>> Give it a try.
>
> Will do.

Me:
>> The usual rule of thumb is that Scheme is faster than Python.
>> Perhaps that assumes you have a compiler.

Carl:
> It must. I suspect also that companies like Chez intentionally > leave out smarts from their free interpreters in order to protect > their industrial licenses. The interp. you get for free (the > "petite") isn't the same one that comes with a $3K licensce, I'll > wager.
I think a GNU implementation of some Lisp dialect has beaten Python in benchmarks.

Me:
>> Be glad you're not using a language with syntactic indentation.

Carl:
> True, ()'s are great. Any scheme expression can be unambiguously > parsed no matter how you lay it out (even on one line).
Oh, syntactic indentation's fine in itself. But it does mean your code gets rendered invalid by Yahoo's white space stripping. Graham
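To make the profiler remark concrete, a minimal sketch (mine, not Graham's): Python's built-in cProfile will show where the time goes in a chord-enumeration workload.

import cProfile
from itertools import combinations

def count_chords():
    # stand-in workload: enumerate 4-element combinations of 14 pitches
    return sum(1 for _ in combinations(range(14), 4))

cProfile.run('count_chords()')      # prints a per-function timing breakdown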