
Two Remarks on Graph Norms.

Frederik Garbe, Jan Hladký, Joonkyung Lee
Published in: Discrete & Computational Geometry (2021)
For a graph H, its homomorphism density in graphs naturally extends to the space of two-variable symmetric functions W in L^p, p ≥ e(H), denoted by t(H, W). One may then define the corresponding functionals ‖W‖_H := |t(H, W)|^{1/e(H)} and ‖W‖_{r(H)} := t(H, |W|)^{1/e(H)}, and say that H is (semi-)norming if ‖·‖_H is a (semi-)norm and that H is weakly norming if ‖·‖_{r(H)} is a norm. We obtain two results that contribute to the theory of (weakly) norming graphs. Firstly, answering a question of Hatami, who estimated the modulus of convexity and smoothness of ‖·‖_H, we prove that ‖·‖_{r(H)} is neither uniformly convex nor uniformly smooth, provided that H is weakly norming. Secondly, we prove that a graph H without isolated vertices is (weakly) norming if and only if each of its components is an isomorphic copy of a single (weakly) norming graph. This strong factorisation result allows us to assume connectivity of H when studying graph norms. In particular, we correct an oversight in the original statement of the aforementioned theorem by Hatami.
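For reference, t(H, W) above is the standard homomorphism-density integral from graph limit theory; the display below is a sketch of that standard definition in the abstract's notation (V(H), E(H), e(H) denoting the vertex set, edge set, and number of edges of H), not a formula quoted from the paper:

\[
  t(H, W) \;=\; \int_{[0,1]^{V(H)}} \prod_{uv \in E(H)} W(x_u, x_v) \prod_{v \in V(H)} \mathrm{d}x_v,
  \qquad
  \|W\|_H := |t(H, W)|^{1/e(H)},
  \qquad
  \|W\|_{r(H)} := t(H, |W|)^{1/e(H)}.
\]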