Uniqueness of minimal projections in smooth expanded matrix spaces



Michał Kozdęba

Received: 31 March 2020 / Accepted: 8 October 2020
© The Author(s) 2020

Abstract Let us consider the space M(n, m) of all real or complex matrices with n rows and m columns. In 2000 Lesław Skrzypek proved the uniqueness of the minimal projection of this space onto its subspace M(n, 1) + M(1, m), which consists of all sums of matrices with constant rows and matrices with constant columns. We generalize this result using methods developed by Lewicki and Skrzypek (J Approx Theory 148:71–91, 2007). Let S be the space of all functions from X × Y × Z into R or C, where X, Y, Z are finite sets. It can be interpreted as a space of three-dimensional matrices M(n, m, r). Let T be the subspace of S consisting of all sums of functions which depend on one variable. Let S be equipped with a smooth norm ‖·‖. We show that there exists a unique minimal projection of S onto its subspace T.

Keywords Minimal projection · Rudin's theorem · Groups of isometries · Unique projection

Mathematics Subject Classification 43A07 · 46B28 · 46E30 · 47D03

1 Introduction

To begin, let us set up some basic terminology and notation.

Definition 1 Let S be a Banach space and let T be a closed linear subspace of S. An operator P : S → T is called a projection if P|_T = id_T. We denote by P(S; T) the set of all linear and continuous (with respect to the operator norm) projections of S onto T.
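As a toy illustration of Definition 1 (not taken from the paper), consider S = R^n with T the subspace of constant vectors. Averaging the coordinates gives a linear, continuous map P : S → T with P|_T = id_T, i.e. a projection in the sense above:

```python
import numpy as np

# Hypothetical example: S = R^n, T = {constant vectors}.
# Averaging the coordinates is a projection of S onto T.
def P(v):
    return np.full_like(v, v.mean())

v = np.array([1.0, 2.0, 6.0])
assert np.allclose(P(P(v)), P(v))   # P is idempotent
c = np.array([3.0, 3.0, 3.0])       # an element of T
assert np.allclose(P(c), c)         # P restricted to T is the identity
```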

Communicated by Adrian Constantin.


Michał Kozdęba
[email protected]

Department of Applied Mathematics, University of Agriculture in Krakow, Kraków, Poland



Definition 2 A projection P₀ ∈ P(S; T) is called minimal if

‖P₀‖ = inf{‖P‖ : P ∈ P(S; T)} =: λ(T; S).

In the theory of minimal projections three main problems are considered: the existence and uniqueness of minimal projections [15–17,19–29], estimates of the constant λ(T; S) [2–5,7–13], and concrete formulas for minimal projections [6,9,18,24]. As one can see, this theory has been widely studied by many authors, also recently [1,11,12,14,18,23].

Let X = {1, 2, 3, ..., n}, Y = {1, 2, 3, ..., m}, Z = {1, 2, 3, ..., r}, where 3 ≤ n, m, r < +∞ are fixed. Define S = M(n, m, r) as the set of all functions from X × Y × Z into R (or C). Let T be the subspace of S consisting of all sums of functions which depend on one variable, i.e.

T = { f ∈ S : f(x, y, z) = g(x) + h(y) + i(z), where g : X → R, h : Y → R, i : Z → R } (or C).
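To get a concrete feel for S and T, here is a sketch (not from the paper) of one particular projection of S = M(n, m, r) onto T: the orthogonal, ANOVA-style "main effects" projection, which is the minimal projection for the Euclidean norm. The paper studies minimality for a general smooth norm ‖·‖, so this only illustrates what an element of P(S; T) looks like:

```python
import numpy as np

# Sketch: the least-squares projection of S = M(n, m, r) onto
# T = M(n,1,1) + M(1,m,1) + M(1,1,r), written with numpy broadcasting.
def proj_T(f):
    gx = f.mean(axis=(1, 2), keepdims=True)  # component depending only on x
    hy = f.mean(axis=(0, 2), keepdims=True)  # component depending only on y
    iz = f.mean(axis=(0, 1), keepdims=True)  # component depending only on z
    grand = f.mean()                         # constants are counted three times
    return gx + hy + iz - 2 * grand          # broadcasts back to shape (n, m, r)

n, m, r = 3, 4, 5
rng = np.random.default_rng(0)
f = rng.standard_normal((n, m, r))

assert np.allclose(proj_T(proj_T(f)), proj_T(f))   # idempotent

# proj_T fixes T: take t(x, y, z) = g(x) + h(y) + i(z)
g = rng.standard_normal((n, 1, 1))
h = rng.standard_normal((1, m, 1))
i = rng.standard_normal((1, 1, r))
t = g + h + i
assert np.allclose(proj_T(t), t)
```

The subtraction of 2·(grand mean) compensates for the constant functions lying in all three summands M(n, 1, 1), M(1, m, 1), M(1, 1, r).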

It is convenient to consider these spaces as spaces of "three-dimensional" matrices with real (or complex) entries. Let M(1, 1, r) be the subspace of the three-dimensional matrix space S consisting of elements a_{ijk} such that a_{i₁j₁k} = a_{i₂j₂k} for any i₁, i₂ ∈ {1, 2, ..., n}, j₁, j₂ ∈ {1, 2, ..., m} and k ∈ {1, 2, ..., r}. Analogously we define M(1, m, 1) and M(n, 1, 1). Then we can write T = M(n, 1, 1) + M(1, m, 1) + M(1, 1, r).

Definition 3 Let Σₙ be the set of all permutations of {1, 2, ..., n}. Define Σₙ × Σₘ × Σᵣ = {π = α × β × γ, where α ∈ Σₙ, β ∈ Σₘ, γ ∈