哺乳动物 posted on 2025-3-26 22:52:38

http://reply.papertrans.cn/39/3837/383603/383603_31.png

假装是你 posted on 2025-3-27 01:37:25

A Neurogeometric Stereo Model for Individuation of 3D Perceptual Units
…3D curves in the visual scene as a sub-Riemannian structure. Horizontal curves in this setting express good continuation principles in 3D. Starting from the equation of neural activity, we apply harmonic analysis techniques in the sub-Riemannian structure to solve the correspondence problem and find…
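For orientation, a minimal sketch of the sub-Riemannian notions the abstract relies on; the lifted space $M$, the horizontal frame $X_1,\dots,X_k$ and the weights $\beta_i$ below are generic placeholders, not the paper's actual definitions:

```latex
% Illustrative only: a generic sub-Riemannian "good continuation" functional.
A lifted curve $\gamma:[0,1]\to M$ is \emph{horizontal} if
\[
  \dot\gamma(t) \in \operatorname{span}\{X_1(\gamma(t)),\dots,X_k(\gamma(t))\}
  \quad \text{for a.e. } t,
\]
and, writing $\dot\gamma(t)=\sum_{i=1}^{k} u_i(t)\,X_i(\gamma(t))$, its sub-Riemannian length is
\[
  \ell(\gamma) = \int_0^1 \sqrt{\textstyle\sum_{i=1}^{k} \beta_i^2\, u_i(t)^2}\; dt .
\]
Horizontal curves of small length between lifted 3D edge elements then play the role of
``good continuation'' in 3D.
```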
GUMP posted on 2025-3-27 08:15:37

Functional Properties of PDE-Based Group Equivariant Convolutional Neural Networks
…solving HJB-PDEs on lifted homogeneous spaces such as the homogeneous space of 2D positions and orientations isomorphic to …. PDE-G-CNNs generalize G-CNNs and are provably equivariant to actions of the roto-translation group SE(2). Moreover, PDE-G-CNNs automate geometric image processing via orientation…
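As background to "solving HJB-PDEs", a hedged sketch of the Euclidean special case: the Hopf–Lax formula turns a Hamilton–Jacobi–Bellman evolution into a morphological (inf-)convolution. In PDE-G-CNNs the analogous inf-convolution is taken over the lifted space of positions and orientations with left-invariant kernels, which is what yields the roto-translation equivariance; only the $\mathbb{R}^n$ case is written out here:

```latex
% Euclidean HJB evolution of an initial image/feature map f:
\[
  \partial_t W(t,x) + H\!\big(\nabla W(t,x)\big) = 0, \qquad W(0,\cdot)=f,
\]
% with convex Hamiltonian H; its viscosity solution is the Hopf--Lax
% (morphological) convolution
\[
  W(t,x) = \inf_{y\in\mathbb{R}^n}\Big\{ f(y) + t\, L\!\Big(\tfrac{x-y}{t}\Big) \Big\},
  \qquad L(v) = \sup_{p}\,\big(\langle p, v\rangle - H(p)\big),
\]
% so dilation/erosion layers amount to inf- or sup-convolutions with a kernel
% determined by the Legendre transform L of H.
```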
HALL posted on 2025-3-27 13:19:47

Continuous Kendall Shape Variational Autoencoders
…representations. The equivariant encoder/decoder ensures that these latents are geometrically meaningful and grounded in the input space. Mapping these geometry-grounded latents to hyperspheres allows us to interpret them as points in a Kendall shape space. This paper extends the recent … paradigm by…
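A minimal sketch of the Kendall preshape normalisation behind "points in a Kendall shape space" (centering plus unit scale); the helper below is hypothetical and only illustrates the geometry, not the paper's equivariant encoder/decoder:

```python
import numpy as np

def to_preshape(landmarks: np.ndarray) -> np.ndarray:
    """Map a (k, m) landmark configuration to Kendall's preshape sphere.

    Removes translation (centering) and scale (unit Frobenius norm);
    the remaining rotation is what Kendall shape space quotients out.
    Illustrative helper, not the paper's encoder.
    """
    centered = landmarks - landmarks.mean(axis=0, keepdims=True)
    norm = np.linalg.norm(centered)
    if norm == 0:
        raise ValueError("degenerate configuration: all landmarks coincide")
    return centered / norm

# Example: two triangles that differ only by translation and scale
# land on the same point of the preshape sphere.
a = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
b = 3.0 * a + 5.0
assert np.allclose(to_preshape(a), to_preshape(b))
```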
VaFOR posted on 2025-3-27 14:07:25

Can Generalised Divergences Help for Invariant Neural Networks?
…during training of convolutional neural networks. Experiments on supervised classification of images at different scales not considered during training illustrate that our proposed method performs better than classical data augmentation.
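A hedged reading of this abstract in code: a divergence-based consistency penalty between predictions on an image and a rescaled copy, added to the usual supervised loss. The function name, the choice of KL divergence and the PyTorch details are assumptions, not the paper's generalised divergences:

```python
import torch
import torch.nn.functional as F

def scale_consistency_penalty(model, x, scale=0.5):
    """Hypothetical divergence penalty between predictions on an image batch
    and on a rescaled copy. Assumes a classifier that accepts variable input
    sizes (e.g. via global pooling). Sketch of the general idea only.
    """
    logp = F.log_softmax(model(x), dim=1)
    x_s = F.interpolate(x, scale_factor=scale, mode="bilinear",
                        align_corners=False)
    logq = F.log_softmax(model(x_s), dim=1)
    # KL(q || p) as one concrete choice of divergence between the two outputs.
    return F.kl_div(logp, logq, log_target=True, reduction="batchmean")
```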
UTTER posted on 2025-3-27 19:28:34

Group Equivariant Sparse Coding
…a group-equivariant convolutional layer with internal recurrent connections that implement sparse coding through neural population attractor dynamics, consistent with the architecture of visual cortex. The layers can be stacked hierarchically by introducing recurrent connections between them. The h…
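For concreteness, a plain (non-equivariant) sketch of sparse coding via attractor dynamics in the style of the locally competitive algorithm; the paper's layer additionally makes the dictionary group-equivariant and stacks such layers with recurrent connections:

```python
import numpy as np

def lca_sparse_code(Phi, x, lam=0.1, tau=0.01, n_steps=500):
    """Sparse coding via locally-competitive attractor dynamics.

    Phi: (n, D) dictionary, x: (n,) input. Plain dense dictionary for
    illustration; not the paper's group-equivariant convolutional layer.
    """
    D = Phi.shape[1]
    b = Phi.T @ x                        # feed-forward drive
    G = Phi.T @ Phi - np.eye(D)          # lateral inhibition (Gram minus identity)
    u = np.zeros(D)                      # membrane potentials
    for _ in range(n_steps):
        a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)   # soft threshold
        u = u + tau * (b - u - G @ a)    # attractor dynamics toward a sparse code
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)
```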
拖债 posted on 2025-3-27 23:58:15

On a Cornerstone of Bare-Simulation Distance/Divergence Optimization
…high-dimensional optimization problems on directed distances (divergences), under very non-restrictive (e.g. non-convex) constraints. Such a task can be comfortably achieved by the new … method of [., .]. In the present paper, we give some new insightful details on one cornerstone of this approach.
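For context, the generic constrained directed-distance (φ-divergence) minimisation that such methods target; the bare-simulation machinery of [., .] itself is not reproduced here:

```latex
\[
  \inf_{Q \in \Omega} D_{\varphi}(Q, P),
  \qquad
  D_{\varphi}(Q, P) = \sum_{i} p_i\, \varphi\!\Big(\frac{q_i}{p_i}\Big),
\]
where $\varphi$ is convex with $\varphi(1)=0$ and the constraint set $\Omega$
need not be convex.
```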
muffler posted on 2025-3-28 04:49:12

Extensive Entropy Functionals and Non-ergodic Random Walks
…asymptotically) proportional to …. According to whether the focus is on the system or on the entropy, an entropy is extensive for a given system or a system is extensive for a given entropy. Yet, exhibiting the right classes of random sequences that are extensive for the right entropy is far from b…
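As a reference point, extensivity in this sense can be stated as follows (a standard definition, written under the assumption that the entropy is indexed by the system size $n$):

```latex
\[
  S(n) \sim \lambda\, n \quad (n \to \infty), \qquad 0 < \lambda < \infty,
\]
% i.e. the entropy of the length-n system grows asymptotically linearly in n.
% For an i.i.d. (hence ergodic) sequence the Boltzmann--Gibbs--Shannon entropy
% satisfies S_BG(n) = n H(p) and is therefore extensive, whereas strongly
% correlated, non-ergodic sequences may require a different entropy functional
% to recover this linear growth.
```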
surrogate posted on 2025-3-28 06:46:37

Empirical Likelihood with Censored Data
…confidence regions and tests for the parameter of interest, by means of minimizing empirical divergences between the considered models and the Kaplan-Meier empirical measure. This approach leads to a new natural adaptation of the empirical likelihood method to the present context of right censored data…
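A small sketch of the Kaplan-Meier (product-limit) estimator underlying the empirical measure mentioned above; the divergence-based confidence regions and tests of the paper are not implemented here:

```python
import numpy as np

def kaplan_meier(times, observed):
    """Kaplan-Meier estimate of the survival function from right-censored data.

    times:    array of follow-up times
    observed: 1 if the event was observed at that time, 0 if right-censored
    Returns (event_times, survival_probabilities). Illustrative only.
    """
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=int)
    order = np.argsort(times)
    times, observed = times[order], observed[order]

    event_times, surv = [], []
    s = 1.0
    n_at_risk = len(times)
    for t in np.unique(times):
        at_this_time = times == t
        d = observed[at_this_time].sum()    # events at time t
        if d > 0:
            s *= 1.0 - d / n_at_risk        # product-limit update
            event_times.append(t)
            surv.append(s)
        n_at_risk -= at_this_time.sum()     # remove events and censorings from risk set
    return np.array(event_times), np.array(surv)
```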
RADE posted on 2025-3-28 11:47:36

http://reply.papertrans.cn/39/3837/383603/383603_40.png