## Monday, April 22, 2013

### HW #9

1. Let I be an ideal in C[x_1..x_n], and fix a term order; let G_I be the sum of all the standard monomials. For example, if I = < x-y^2 > in C[x,y] and the term order makes x the leading term, then G_I = 1/(1-y). Whereas if the leading term is y^2, then G_I = (1+y)/(1-x). (Right?)

a. If I is homogeneous, what is the relation of G_I and the Hilbert series H_I?

b. Show that G_I is always a rational function.

c. If F is a subset of {1..n}, let t_F = product_{f in F} x_f / (1-x_f).
If I is generated by squarefree monomials, show G_I = sum_F t_F, where the sum is over those F such that product_{f in F} x_f is a standard monomial.
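The examples above (and the grouping in (c)) can be spot-checked by brute force. A hedged Python sketch, enumerating exponent vectors up to a cutoff (a numeric check, not a proof):

```python
from itertools import product

N = 8  # check exponents up to N

# (preamble) I = <x - y^2> in C[x,y]: a monomial x^a y^b is standard iff it
# is not divisible by the leading monomial.
def standard_2var(lead):
    a0, b0 = lead
    return {(a, b) for a in range(N) for b in range(N)
            if not (a >= a0 and b >= b0)}

# leading term x: standard monomials are 1, y, y^2, ..., so G_I = 1/(1-y)
assert standard_2var((1, 0)) == {(0, b) for b in range(N)}
# leading term y^2: standard monomials are x^a and x^a*y, so G_I = (1+y)/(1-x)
assert standard_2var((0, 2)) == {(a, b) for a in range(N) for b in (0, 1)}

# (part c) squarefree example I = <x*y> in C[x,y,z]: a monomial is standard
# iff the product of the variables in its support is standard, which is
# exactly the grouping behind G_I = sum_F t_F.
def is_standard(e):                  # not divisible by x*y
    return not (e[0] >= 1 and e[1] >= 1)

for e in product(range(N), repeat=3):
    support_monomial = tuple(1 if ei >= 1 else 0 for ei in e)
    assert is_standard(e) == is_standard(support_monomial)

print("generating-function examples check out")
```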

2. Consider the set of pairs {(A,B) of nxn matrices : AB and BA are upper triangular}.

a. Use the "codim" command in Macaulay 2 to check that, for n=2,3, this is a complete intersection.

b. Assuming this for general n, what's its degree?

c. Decompose it (for n=2,3) and determine the degrees of the components. Check that they add up to the answer in (b).

3. In C[a,b,c,d], let I be the ideal of the ab-plane union the cd-plane.

a. What is that ideal?

b. Let J be the ideal < a + c - 1, b + d - 2 >. Describe V(I+J). Is it radical?

c. Let K be the ideal < a + c, b + d >. Describe V(I+K) and compute its degree.

d. Do you find (b) vs. (c) disturbing?

## Saturday, April 13, 2013

### HW #8, due April 19

1. Let I be an ideal in C[x_1..x_n]; let sqrt(J) denote the radical of an ideal J, and J^h its homogenization.

2. Show (A intersect B)^h = A^h intersect B^h, where A,B are ideals in C[x_1..x_n].

3. Show (A+B)^h contains A^h + B^h.

4a. Find an example where this containment is proper.
b. Explain geometrically why you might expect this to happen.
(If you can do (b) first, it might help you with (a); on the other hand it's easy to luck into an example of (a).)

5. How can you compute (A+B)^h from A^h + B^h? (Again, 4(b) should help.)

6. Let R = C[a,b,c,d], and p = a*(ad-bc)*d. Let I = < p >.
Decompose I as an intersection of prime ideals.
If you add some of those prime ideals together, you get new ideals; decompose all those.
Keep adding and decomposing until you can't find any new ideals this way.
Draw the poset of ideals, with each one labeled with its Hilbert polynomial.

If you use Macaulay 2 to do #6, so much the better, though it's not really so hard to do this little example directly. Anyway, if you do, turn in your code too.

## Saturday, April 06, 2013

### HW #7, due 4/11

Let P be the set of polynomials {p(d)} in one variable, d, such that p(d) is an integer for all integer d. Of course this includes Z[d], but is larger, e.g. p(d) = d(d-1)/2. In particular, this includes Hilbert polynomials.

1. For each k, show c_k(d) := d(d-1)...(d-k+1)/k! is in P.
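This integrality is easy to spot-check numerically. A hedged Python sketch (it just confirms the value is an integer for small k over a range of integers d, including negative ones; not a proof):

```python
from math import factorial

# c_k(d) = d(d-1)...(d-k+1)/k!; the assert inside fails if any value
# is not an integer.
def c(k, d):
    num = 1
    for i in range(k):
        num *= (d - i)
    q, r = divmod(num, factorial(k))
    assert r == 0, "not an integer value"
    return q

for k in range(6):
    for d in range(-10, 10):
        c(k, d)  # raises if integrality fails
print("c_k(d) is an integer for all tested k, d")
```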

2. Show that the {c_k} are a basis of C[d].

3. Define Delta : P -> P by (Delta p)(d) = p(d) - p(d+1). Compute Delta(c_k).

4. Show that the {c_k} are a Z-basis of P: when we expand an element of P in the {c_k}, the coefficients are in Z.

5. Define the _Hilbert series_ H_I(t) of a homogeneous ideal I as the series \sum_d h_I(d) t^d. Show that if I is the 0 ideal in C[x_0,...,x_n], then H_I(t) = 1/(1-t)^{n+1}.
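A numeric sanity check of #5, in Python: for the zero ideal, h_I(d) counts all monomials of degree d in n+1 variables, and the coefficient of t^d in 1/(1-t)^{n+1} is binomial(d+n, n). (A sketch for small n and d, not a proof.)

```python
from itertools import combinations_with_replacement
from math import comb

# Monomials of degree d in n+1 variables = multisets of size d drawn from
# the n+1 variables; compare their count with binomial(d+n, n).
n = 3  # work in C[x_0..x_3]
for d in range(6):
    num_monomials = sum(1 for _ in combinations_with_replacement(range(n + 1), d))
    assert num_monomials == comb(d + n, n)
print("h(d) matches the coefficients of 1/(1-t)^{n+1}")
```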

6. Recall that r is a _zero divisor_ in R/I if r is nonzero there and there exists a nonzero s such that rs = 0. Assume that I is graded and r is homogeneous of degree k and not a zero divisor in R/I.
It turns out that H_{I + < r >} is something times H_I; figure out what.

7. Assume that I is graded, r is homogeneous of degree k, and the formula you figured out in #6 holds. Show that, conversely, this implies r is not a zero divisor.

## Monday, March 11, 2013

### Macaulay 2 on-line!

It's still semi-secret, but you can run Macaulay 2 on-line if you didn't manage to install it.

## Sunday, March 03, 2013

"1. Let r be a positive irrational number. Write x^a y^b < x^c y^d if a+rb < c+rd.
Show that this defines a monomial order."

We need to check that it's a total order (this uses the irrationality), that any decreasing sequence of monomials terminates (this uses the positivity), that it's transitive (really easy), and that a < b implies ac < bc (really easy).
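These properties can be spot-checked in code. A hedged Python sketch with r = sqrt(2), comparing a + r*b against c + r*d exactly in integer arithmetic (so the irrationality isn't lost to floating point):

```python
# Decide a + sqrt(2)*b < c + sqrt(2)*d, i.e. (a-c) < sqrt(2)*(d-b),
# by comparing signs and then squares; no floats involved.
def less(ab, cd):
    (a, b), (c, d) = ab, cd
    lhs, k = a - c, d - b            # deciding: lhs < sqrt(2) * k ?
    if lhs < 0 and k >= 0:
        return True
    if lhs >= 0 and k <= 0:
        return False                 # equality only when lhs == k == 0
    if k > 0:                        # lhs >= 0 here: compare squares
        return lhs * lhs < 2 * k * k
    return lhs * lhs > 2 * k * k     # lhs < 0, k < 0: inequality flips

mons = [(a, b) for a in range(6) for b in range(6)]
for u in mons:
    for v in mons:
        # total order: exactly one of u == v, u < v, v < u holds
        assert (u == v) + less(u, v) + less(v, u) == 1
        if less(u, v):
            for w in mons:
                # compatible with multiplication: u < v implies uw < vw
                uw = (u[0] + w[0], u[1] + w[1])
                vw = (v[0] + w[0], v[1] + w[1])
                assert less(uw, vw)
print("total and multiplicative on all tested monomials")
```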

"2. Consider the vector space of 2x4 matrices M, with entries (m_ij), and let X = {M : M's rows are linearly dependent}.
For i < j, both from 1 to 4, let p_ij be the 2x2 determinant using columns i,j of M.
What's the relation of p_ij to X?"

Each p_ij vanishes on X. More specifically, X is the vanishing set of the ideal generated by the six p_ij.
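The "each p_ij vanishes on X" direction is easy to spot-check numerically. A Python sketch (rank-deficient matrices built by scaling one row; a sample check, not a proof):

```python
import random

# If M's rows are linearly dependent, every 2x2 minor p_ij of M vanishes.
def minors_vanish(M):
    return all(M[0][i] * M[1][j] - M[0][j] * M[1][i] == 0
               for i in range(4) for j in range(i + 1, 4))

random.seed(1)
for _ in range(100):
    row = [random.randint(-5, 5) for _ in range(4)]
    t = random.randint(-5, 5)
    assert minors_vanish([row, [t * x for x in row]])
print("all 2x2 minors vanish on row-dependent matrices")
```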

"3. (continuing 2) Lex-order the variables m_11, m_12, m_13, m_14, m_21, m_22, m_23, m_24.
What are the leading terms of the six guys p_ij?"

p_ij = m_1i m_2j - m_1j m_2i. Since i < j, the lex order sees the variable m_1i before m_1j, so the leading term is m_1i m_2j.

"4. (continuing) Show that the p_ij are a Gr\"obner basis, by computing S-pairs."

If the LCM of the two leading terms is just the product, the S-pair quickly reduces to 0.
So the cases to consider are S(p_ij, p_ik), S(p_ik, p_jk), where i < j < k.
(Secretly, I'm thinking about 2xn matrices, not just 2x4, here.)

I like to do this by starting with LCM - LCM, reducing each copy using a different generator to get something less obviously in I, then reducing that until stuck.

For S(p_ij, p_ik), whose leading terms have LCM m_1i m_2j m_2k:

-> m_2i m_1j m_2k - m_2i m_2j m_1k
-> m_2i m_2j m_1k - m_2i m_2j m_1k
= 0

For S(p_ik, p_jk), whose leading terms have LCM m_1i m_1j m_2k:

-> m_2i m_1j m_1k - m_1i m_2j m_1k
-> m_2i m_1j m_1k - m_2i m_1j m_1k
= 0
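These reductions can be double-checked numerically: the identity behind them is the syzygy S(p_ij, p_ik) = m_2k p_ij - m_2j p_ik = -m_2i p_jk. A Python sketch testing that identity on random integer matrices (a spot-check, not a symbolic proof):

```python
import random

def minor(m, i, j):
    return m[0][i] * m[1][j] - m[0][j] * m[1][i]

# S(p_ij, p_ik) = m_2k p_ij - m_2j p_ik should equal -m_2i p_jk,
# which is why each S-pair reduces to 0.
def syzygy_holds(m, i, j, k):
    s_pair = m[1][k] * minor(m, i, j) - m[1][j] * minor(m, i, k)
    return s_pair == -m[1][i] * minor(m, j, k)

random.seed(0)
for _ in range(100):
    m = [[random.randint(-10, 10) for _ in range(4)] for _ in range(2)]
    for i in range(4):
        for j in range(i + 1, 4):
            for k in range(j + 1, 4):
                assert syzygy_holds(m, i, j, k)
print("S-pair syzygy holds on all samples")
```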

### HW #6

Book problems (p48) 3.1, 3.3, 3.6.

#4. Let I = < f >, J = < g > be principal ideals in C[x_1..x_n].
a) Figure out what I intersect J is, given that it's supposed to give the union of the vanishing sets.
b) Follow the computation of (tI + (1-t)J) intersect C[x_1..x_n], explicitly, to see that you get the same answer as you did in part (a).

#5. Get Macaulay 2 running on some computer, to the extent that you can do the following computation. I've put [editorial comments] in the transcript below to explain what's going on.

----------------------------------------------------------------------------
Macaulay2, version 1.4
with packages: ConwayPolynomials, Elimination, IntegralClosure, LLLBases, PrimaryDecomposition, ReesAlgebra, TangentCone

i1 : R = QQ[m_(1,1)..m_(2,3)];

[R is the ring of polynomials with rational coefficients in six variables m_(i,j)]

i2 : M = transpose genericMatrix(R,m_(1,1),3,2)

[M is the matrix with the matrix entries m_(i,j) ]

i3 : I = ideal {det M_{0,1}, det M_{1,2}}

[M_{list} is the submatrix using only those columns, numbered 0,...,width-1 as usual in computer science. Here we're asking that columns 0 and 1, and columns 1 and 2, be linearly dependent.]

i4 : cs = decompose I

[We did this example in class a couple of weeks ago; this ideal I is not a prime ideal, but the intersection of two prime ideals. In one of them column 1 vanishes, in the other columns 0 and 2 are linearly dependent. "decompose" finds the two.]

i5 : intersect cs

[This just recovers I -- it is indeed the intersection of the two.]
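The decomposition found by "decompose" can be spot-checked numerically. A Python sketch (a sample check on integer points, assuming the two components described above):

```python
import random

def minor(M, i, j):
    return M[0][i] * M[1][j] - M[0][j] * M[1][i]

# The two claimed components: middle column vanishes, or columns 0 and 2
# are linearly dependent.
def on_a_component(M):
    middle_zero = M[0][1] == 0 and M[1][1] == 0
    return middle_zero or minor(M, 0, 2) == 0

random.seed(2)
checked = 0
for _ in range(5000):
    M = [[random.randint(-3, 3) for _ in range(3)] for _ in range(2)]
    # a point of V(I): columns 0,1 and columns 1,2 each linearly dependent
    if minor(M, 0, 1) == 0 and minor(M, 1, 2) == 0:
        assert on_a_component(M)
        checked += 1
print("checked", checked, "points of V(I); all lie on one of the two components")
```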

### Feb 28

The geometry of (tI + (1-t)J) intersect C[x_1...x_n]: it's the projection of (V(I) x {1}) union (V(J) x {0}), hence V(I) union V(J).

Example: we used this rule to compute the ideal vanishing on the set {(0,0), (1,1)}.

Colon ideals.
V(I : J) = the closure of V(I) \ V(J).
The definition of I : f^{infinity} (which we'll compute next time as an elimination).
Theorem: f is in the radical of I <=> I : f^{infinity} contains 1.
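The theorem is easy to illustrate for monomial ideals, where membership and saturation are combinatorial. A hedged Python sketch for I = <x^2 y> in C[x,y], whose radical is <xy> (an illustration on a few sample f's, not a proof):

```python
# Membership for the monomial ideal I = <x^2 y>: x^a y^b is in I
# iff a >= 2 and b >= 1.
def in_I(a, b):
    return a >= 2 and b >= 1

# 1 is in I : f^infty iff some power of f lies in I; we only look at
# powers up to a cutoff, which suffices for these samples.
def one_in_saturation(fa, fb, max_power=10):
    return any(in_I(k * fa, k * fb) for k in range(1, max_power))

assert one_in_saturation(1, 1)        # f = x*y is in sqrt(I) = <x*y>
assert not one_in_saturation(1, 0)    # f = x is not in the radical
assert not one_in_saturation(0, 1)    # f = y is not in the radical
print("f in sqrt(I) iff I : f^infty contains 1, on these samples")
```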

## Tuesday, February 26, 2013

### Feb 26

A C-algebra is a ring that's also a complex vector space, and a C-algebra homomorphism is a ring homomorphism that's C-linear.

If R is a C-algebra, then C-Spec(R) := {the C-algebra homs R->C}.

If R->S is a C-algebra homomorphism, there's a natural map C-Spec(S) -> C-Spec(R).

If R is a polynomial ring, C-Spec(R) is the corresponding vector space. If R is a polynomial ring modulo an ideal I, C-Spec(R) is V(I).

The inclusion of a polynomial subring corresponds, under taking C-Spec, to linear projection.

Intersecting an ideal I with a polynomial subring, called elimination, corresponds to taking the projection of V(I) and then taking the closure.

Theorem. If I, J are ideals in C[x_1..x_n], so that tI and (1-t)J generate ideals in C[t,x_1..x_n], then
I intersect J = (tI + (1-t)J) intersect C[x_1..x_n].
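A brute-force illustration of the geometry behind the theorem, in Python, with I = <x> and J = <y-1> chosen for this sketch: projecting the common zeros of t*x and (1-t)*(y-1) down to the (x,y)-plane should give V(I) union V(J), i.e. {x = 0} union {y = 1}. (A grid check, not a proof.)

```python
# A small grid of exact half-integers (so float comparisons are exact).
vals = [v / 2 for v in range(-6, 7)]

# Points (x, y) hit by projecting V(t*I + (1-t)*J) from (x, y, t)-space.
projected = {(x, y) for x in vals for y in vals for t in vals
             if t * x == 0 and (1 - t) * (y - 1) == 0}

# V(I) union V(J) on the same grid.
union = {(x, y) for x in vals for y in vals if x == 0 or y == 1}

assert projected == union
print("projection equals V(I) union V(J) on the grid")
```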

## Friday, February 22, 2013

### HW #5

Book problems, p30: 2.4, 2.8, 2.10, 2.11 (assume Buchberger's algorithm, section 2.5), 2.12, 2.14.

### Feb 21

Proof of the Buchberger criterion for a Gröbner basis.