## mathematical optimization: find the minimum with a positive-definite matrix constraint

Let's say I want to find the minimum value of the determinant of a matrix under the condition that the matrix is positive definite. I try:

M = {{a,0},{0,b}}

FindMinimum[{Det[M],a>=1,b>=1,PositiveDefiniteMatrixQ[M]},{a,b}]


This returns an error, Constraints in {False} are not all equality or inequality constraints..., suggesting that PositiveDefiniteMatrixQ is evaluated immediately with symbolic a, b, and not re-evaluated at each iteration with numeric values of a, b.

I then tried to delay the evaluation of PositiveDefiniteMatrixQ by wrapping it in Delayed, which returns a similar error: Constraints in {Delayed[PositiveDefiniteMatrixQ[M]], a>=1, b>=1} are not all equality or inequality constraints.

How can I impose such a restriction on the FindMinimum function?
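The question is about Wolfram Language, but the underlying fix—replace the Boolean PositiveDefiniteMatrixQ test with explicit inequality constraints that the optimizer can evaluate numerically, for instance on the eigenvalues—can be sketched in Python with SciPy. The epsilon tolerance and starting point below are my own choices, not from the question:

```python
# Sketch of a workaround: encode positive definiteness as the inequality
# "smallest eigenvalue >= eps", which a numeric optimizer can handle,
# instead of a Boolean predicate that evaluates symbolically to False.
import numpy as np
from scipy.optimize import minimize

EPS = 1e-9  # tolerance standing in for strict positive definiteness

def det_objective(x):
    a, b = x
    return a * b  # Det[{{a, 0}, {0, b}}] == a*b

def min_eigenvalue(x):
    a, b = x
    # Smallest eigenvalue of the diagonal matrix {{a, 0}, {0, b}}
    return float(np.min(np.linalg.eigvalsh(np.diag([a, b])))) - EPS

constraints = [
    {"type": "ineq", "fun": lambda x: x[0] - 1},  # a >= 1
    {"type": "ineq", "fun": lambda x: x[1] - 1},  # b >= 1
    {"type": "ineq", "fun": min_eigenvalue},      # positive definiteness
]

result = minimize(det_objective, x0=[2.0, 2.0], constraints=constraints)
# The minimum of a*b subject to a >= 1, b >= 1 is 1, attained at a = b = 1.
```

For this particular diagonal matrix the same idea in Mathematica reduces to FindMinimum[{a b, a >= 1, b >= 1}, {a, b}], since positive definiteness of {{a,0},{0,b}} is just a > 0 && b > 0.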

## arithmetic geometry: equivalent definitions of the ring $B_{cris}$

I am reading Lurie's notes on the Fargues–Fontaine curve, and I think they use a different definition of $B_{cris}$. In general, when $R$ is a perfect ring of characteristic $p$, $B^{+}_{cris}(R)$ is defined as the $p$-adic completion of the divided power envelope of the map $W(R) \to R$, and $B_{cris} = B^{+}_{cris}[1/t]$.

But in these notes, when $R$ is the valuation ring of an algebraically closed perfectoid field, a ring $B$ is defined as the completion of $\operatorname{Frac}(W(R))$ with respect to all Gauss norms, and the Fargues–Fontaine curve is defined from it.

I want to know the relationship between $B$ and $B_{cris}$ in general. Is it true that they are isomorphic when $R$ is the valuation ring of a perfectoid field?

## automata: definition of a regular grammar, but without $Q \rightarrow \varepsilon$

I defined a regular grammar (FSM) for words that begin with $ab$ and end with $ba$, as follows:

1. $S \rightarrow aS$
2. $S \rightarrow bS$
3. $S \rightarrow aT$
4. $T \rightarrow bR$
5. $R \rightarrow aQ$
6. $Q \rightarrow aQ$
7. $Q \rightarrow bQ$
8. $Q \rightarrow \varepsilon$

where $S$ is the start symbol, $\varepsilon$ is the empty string (null), and the rest are just variables.

Rules 6, 7 and 8 are there so that we can finish a word. However, I am trying to rewrite my grammar without the rule $Q \rightarrow \varepsilon$: I cannot use the empty string.

Can it be done? I am not sure how.

Thank you
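One standard rewriting (my own sketch, not from the question) is to let productions end in a bare terminal, $A \rightarrow x$, instead of erasing $Q$ with an $\varepsilon$-rule: keep rules 1–5, add $R \rightarrow a$, and replace rules 6–8 by $Q \rightarrow aQ \mid bQ \mid a \mid b$. A brute-force check in Python that the two grammars generate the same words up to a length bound:

```python
# Compare the original grammar (with "" standing for epsilon) against the
# epsilon-free rewriting, by enumerating all derivable words up to a bound.

# Original grammar from the question.
ORIGINAL = {
    "S": ["aS", "bS", "aT"],
    "T": ["bR"],
    "R": ["aQ"],
    "Q": ["aQ", "bQ", ""],
}

# Rewritten grammar: no epsilon rule; R and Q may end with a bare terminal.
REWRITTEN = {
    "S": ["aS", "bS", "aT"],
    "T": ["bR"],
    "R": ["aQ", "a"],
    "Q": ["aQ", "bQ", "a", "b"],
}

def language(grammar, max_len):
    """All terminal words of length <= max_len derivable from S."""
    done, frontier = set(), {"S"}
    while frontier:
        new = set()
        for form in frontier:
            # Right-linear: at most one variable, always the last symbol.
            if form and form[-1] in grammar:
                head, var = form[:-1], form[-1]
                for rhs in grammar[var]:
                    cand = head + rhs
                    # Prune forms whose terminal prefix already exceeds the bound.
                    if len(cand.rstrip("STRQ")) <= max_len:
                        new.add(cand)
            elif len(form) <= max_len:
                done.add(form)  # fully terminal word
        frontier = new
    return done

same = language(ORIGINAL, 8) == language(REWRITTEN, 8)
```

The key fact used here is that regular (right-linear) grammars allow rules of the form $A \rightarrow x$ with $x$ a single terminal, so the derivation can stop without ever producing $\varepsilon$.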

## Probability – definition of an $r$-permutation

I am trying to understand the definition of an $r$-permutation. Suppose you have $7$ seats in a row and $7$ different people; there are $7!$ different ways to seat them.

Now suppose I am trying to seat $7$ different people in $9$ seats. Is that when I use the formula $P(n,r) = \frac{n!}{(n-r)!}$?

My thinking is that there are two identical (empty) seats, so the number of distinguishable permutations is $9!/2!$, which is exactly the formula above. So my question is: is an $r$-permutation a method for counting the arrangements of $r$ objects in $n$ slots when $n \ge r$?

A follow-up if you have time: if the seats formed a circle, would we divide by $7$, since seatings that differ only by a rotation are equivalent?
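The counting argument above can be checked numerically. The following snippet (my own illustration) compares $P(9,7)$ with a brute-force enumeration and with the $9!/2!$ argument, and also computes the $7!/7 = 6!$ count for the circular follow-up:

```python
# Verify P(n, r) = n!/(n-r)! against brute-force enumeration for n=9, r=7.
from itertools import permutations
from math import factorial

def P(n, r):
    """Number of r-permutations of n objects."""
    return factorial(n) // factorial(n - r)

# Brute force: every ordered assignment of 7 people to 7 of the 9 seats.
brute = sum(1 for _ in permutations(range(9), 7))

# Circular follow-up: 7 people around a circle, rotations identified.
circular = factorial(7) // 7  # = 6!
```

So the two-empty-seats argument and the $P(n,r)$ formula agree: $9!/2! = P(9,7) = 181440$.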

## syntax – domain of definition for the following variables: is it possible to derive it in Mathematica?

Consider the 4-vectors
$$P_{0} = (E_{0},\, 0,\, 0,\, \sqrt{E_{0}^{2}-m_{0}^{2}}), \qquad P_{i} = (E_{i},\; p_{i}\, s(\theta_{i})\, c(\phi_{i}),\; p_{i}\, s(\phi_{i})\, s(\theta_{i}),\; p_{i}\, c(\theta_{i})),$$
with $c \equiv \cos$, $s \equiv \sin$, $p_{i} \equiv \sqrt{E_{i}^{2}-m_{i}^{2}}$, and scalar products
$$P_{i} \cdot P_{j} \equiv P_{i}^{0} P_{j}^{0} - \sum_{k=1}^{3} P_{i}^{k} P_{j}^{k}.$$
Here $m_{0},\dots,m_{3}$ and $E_{0}$ play the role of real parameters, with $E_{0} > m_{0} > m_{1}+m_{2}+m_{3}$ and $E_{i} \geqslant m_{i}$, while $E_{i}, \theta_{i}, \phi_{i}$ are the variables.

The implicit region of definition of $E_{i}, \theta_{i}, \phi_{i}$ is given by
$$P_{3} = P_{0}-P_{1}-P_{2}, \tag{1}$$
$$s_{12,\text{min}}(s_{23}) < s_{12} < s_{12,\text{max}}(s_{23}), \tag{2}$$
where $s_{ij} = m_{i}^{2}+m_{j}^{2}+2\,P_{i}\cdot P_{j}$, and
$$s_{12,\text{min}/\text{max}} = m_{1}^{2}+m_{2}^{2} - \frac{1}{2 s_{23}}\Big((s_{23}-m_{0}^{2}+m_{1}^{2})(s_{23}-m_{2}^{2}-m_{3}^{2}) \pm \sqrt{\lambda(s_{23},m_{0}^{2},m_{1}^{2})\,\lambda(s_{23},m_{2}^{2},m_{3}^{2})}\Big), \tag{3}$$
$$s_{23,\text{min}} = (m_{2}+m_{3})^{2}, \qquad s_{23,\text{max}} = (m_{0}-m_{1})^{2}, \qquad \lambda(a,b,c) = (a-b-c)^{2}-4bc. \tag{4}$$

I need to integrate a function $f(E_{i}, \theta_{i}, \phi_{i})$ over the domain of definition $(1)$–$(4)$ of these variables. Is it possible to derive the domain of definition in Mathematica, at least implicitly, so as to perform the integration? There are so many variables…
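Independently of how Mathematica represents the region, the bounds $(2)$–$(4)$ can be written as plain functions and used as an indicator for a numeric integrator. A sketch in Python (function names are mine):

```python
# Encode the Dalitz-plot bounds (2)-(4) as functions; in_domain can serve as
# an indicator function (characteristic function) inside a numeric integral.
from math import sqrt

def kallen(a, b, c):
    """lambda(a, b, c) = (a - b - c)^2 - 4*b*c, as in (4)."""
    return (a - b - c) ** 2 - 4 * b * c

def s12_bounds(s23, m0, m1, m2, m3):
    """Return (s12_min, s12_max) from (3) for a given s23."""
    disc = kallen(s23, m0**2, m1**2) * kallen(s23, m2**2, m3**2)
    common = (s23 - m0**2 + m1**2) * (s23 - m2**2 - m3**2)
    s12_min = m1**2 + m2**2 - (common + sqrt(disc)) / (2 * s23)
    s12_max = m1**2 + m2**2 - (common - sqrt(disc)) / (2 * s23)
    return s12_min, s12_max

def in_domain(s12, s23, m0, m1, m2, m3):
    """Indicator for (2) and (4): the physical region of the variables."""
    if not (m2 + m3) ** 2 <= s23 <= (m0 - m1) ** 2:
        return False
    lo, hi = s12_bounds(s23, m0, m1, m2, m3)
    return lo <= s12 <= hi
```

At the endpoint $s_{23} = (m_0 - m_1)^2$ the first Källén factor vanishes, so the band $(2)$ closes up, as expected for the boundary of the physical region.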

## abstract algebra: show that an identity element does not exist with this definition


## What Automatic means in a function definition

I've been trying to work this out, without success. Reading someone else's code, I found a function definition that says something like this:

UserDefinedFunction[a_Integer, b_Integer, c_Integer, d_, k_:Automatic]/;Abs[a]>b:=0;


I understand that it means the function should return 0 if Abs[a] > b, but what does the Automatic in k_:Automatic mean? What is its wider use? How does it relate to when I plot something and tell Mathematica to assign PlotRange -> Automatic?
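In Wolfram Language, k_:Automatic declares an optional argument k whose default value is the symbol Automatic; the function body is then expected to test for Automatic and substitute a sensible value itself, exactly as plotting functions do with PlotRange -> Automatic. A rough Python analogue (my own illustration, using a sentinel object; the resolved default is made up):

```python
# Mimic Mathematica's k_:Automatic pattern with a sentinel default: the
# caller may omit k, and the body resolves the sentinel to a chosen value.
AUTOMATIC = object()  # stands in for Mathematica's Automatic symbol

def user_defined_function(a, b, c, d, k=AUTOMATIC):
    """Loose analogue of the question's UserDefinedFunction."""
    if abs(a) > b:               # the /; Abs[a] > b condition
        return 0
    if k is AUTOMATIC:           # resolve the sentinel, like PlotRange -> Automatic
        k = max(abs(a), abs(b))  # hypothetical internally chosen default
    return k
```

The point of the pattern is that the default is visible in the signature but its actual value is computed inside, so the function can pick it based on the other arguments.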

## Information theory – Definition of collision entropy

Collision entropy is defined as the Rényi entropy for the case $\alpha = 2$. It is given by

$$\mathrm{H}_{2}(X) = -\log \sum_{i=1}^{n} p_{i}^{2} \tag{1}$$

Take two independent random variables $X$ and $X'$ that follow the same probability distribution. The probability of a collision is simply $P_{\text{coll}} = \sum_{i=1}^{n} p_{i}^{2}$. I would therefore expect collision entropy to be just $H(P_{\text{coll}})$, that is,

$$-\left(\sum_{i=1}^{n} p_{i}^{2}\right)\log\left(\sum_{i=1}^{n} p_{i}^{2}\right) - \left(1-\sum_{i=1}^{n} p_{i}^{2}\right)\log\left(1-\sum_{i=1}^{n} p_{i}^{2}\right).$$

This is in analogy with the binary entropy, but with the probability replaced by the probability of a collision.

What is the motivation behind choosing $(1)$ as the definition of collision entropy?
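The two candidate definitions can be compared numerically. This snippet (my own illustration; the question does not fix the base of the logarithm, so I use base 2) evaluates both for the uniform distribution on four outcomes:

```python
# Compare Renyi-2 entropy H2 = -log2(sum p_i^2) with the binary entropy
# of the collision probability, for a given discrete distribution p.
from math import log2

def h2(p):
    """Collision (Renyi-2) entropy, definition (1), in bits."""
    return -log2(sum(pi ** 2 for pi in p))

def binary_entropy_of_collision(p):
    """The alternative proposed in the question: h(P_coll)."""
    q = sum(pi ** 2 for pi in p)  # collision probability
    return -q * log2(q) - (1 - q) * log2(1 - q)

uniform = [0.25] * 4
# For the uniform distribution on n outcomes, H2 = log2(n) = 2 bits here,
# while the binary entropy of P_coll = 1/4 gives roughly 0.811 bits.
```

Note one immediate difference: $(1)$ grows like $\log n$ on uniform distributions, as an entropy should, whereas the binary-entropy variant is bounded by 1 bit regardless of $n$.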

## How to find f(x) and a such that f'(a) is equal to the given limit

Consider $$\lim_{h \to 0} \frac{\sqrt[4]{16+h}-2}{h}.$$

a) Find $a$ and $f(x)$ such that $f'(a)$ is equal to the limit given above.
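A worked answer (mine, so treat it as one possible choice): the expression is the difference quotient of $f(x) = \sqrt[4]{x}$ at $a = 16$, since $\sqrt[4]{16} = 2$, so the limit equals $f'(16) = \tfrac{1}{4}\,16^{-3/4} = \tfrac{1}{32}$. A quick numeric check:

```python
# Evaluate the difference quotient of f(x) = x**(1/4) at a = 16 for a
# small h and compare it with f'(16) = (1/4) * 16**(-3/4) = 1/32.
def difference_quotient(h):
    return ((16 + h) ** 0.25 - 2) / h

expected = 1 / 32          # f'(16) for f(x) = x**(1/4)
approx = difference_quotient(1e-6)
```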