A Sheaf Neural Network (SNN) is a type of Graph Neural Network (GNN) that operates on a sheaf, an object that equips a graph with vector spaces over its nodes and edges and with linear maps between them. Models of this type have been shown to have useful theoretical properties that help tackle issues arising from heterophily and over-smoothing. One complication intrinsic to SNNs is computing the sheaf, which is generally unknown and must be learned end-to-end using standard gradient-based approaches. In this work, we propose a novel way of computing sheaves, drawing inspiration from Riemannian geometry: we leverage the manifold assumption to compute manifold- and graph-aware orthogonal maps that optimally align the tangent spaces of neighbouring data points. We show that this approach achieves promising results with lower computational overhead than previous SNN models. Overall, this work provides an interesting connection between algebraic topology and differential geometry, and we hope that it will spark future research in this direction.
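To make the geometric idea concrete, the following is a minimal sketch (not the paper's actual pipeline) of one standard way to obtain orthogonal maps that align tangent spaces: estimate each point's tangent space by local PCA over its nearest neighbours, then solve an orthogonal Procrustes problem between the two bases. All function names and parameter choices here are illustrative assumptions, using only NumPy.

```python
import numpy as np

def tangent_basis(X, i, k=10, d=2):
    """Estimate a d-dimensional tangent basis at point i via local PCA.

    Illustrative assumption: the tangent space is spanned by the top-d
    right singular vectors of the centered k-nearest-neighbour patch.
    """
    dists = np.linalg.norm(X - X[i], axis=1)
    nbrs = np.argsort(dists)[1:k + 1]          # skip the point itself
    centered = X[nbrs] - X[nbrs].mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return Vt[:d].T                            # (ambient_dim, d), orthonormal columns

def procrustes_align(Ti, Tj):
    """Orthogonal O in O(d) minimizing ||Ti @ O - Tj||_F.

    Closed-form solution via the SVD of Ti.T @ Tj (orthogonal Procrustes).
    """
    U, _, Vt = np.linalg.svd(Ti.T @ Tj)
    return U @ Vt                              # (d, d), orthogonal

# Toy data: points on a 2-D linear subspace embedded in R^3, so the
# tangent spaces at any two points coincide and alignment is exact.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 3))
Ti, Tj = tangent_basis(X, 0), tangent_basis(X, 1)
O = procrustes_align(Ti, Tj)
```

In a sheaf-learning context, such orthogonal maps would play the role of restriction maps on edges; the sketch only shows the alignment step, not how the maps enter a sheaf Laplacian or a trained model.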