High-order phenomena are pervasive across complex systems, yet their formal characterisation remains a formidable challenge. The literature provides various information-theoretic quantities that capture high-order interdependencies, but their conceptual foundations and mutual relationships are not well understood. The lack of unifying principles underpinning these quantities impedes a principled selection of the analytical tools best suited to a given application. Here we introduce entropic conjugation as a formal principle for investigating the space of possible high-order measures, which clarifies the nature of existing high-order measures while revealing gaps in the literature. Additionally, entropic conjugation leads to notions of symmetry and skew-symmetry that serve as key indicators of a balanced account of high-order interdependencies. Our analyses highlight the O-information as the unique skew-symmetric measure whose estimation cost scales linearly with system size, and which spontaneously emerges as a natural axis of variation among high-order quantities in real-world and simulated systems.