
On Modal Clustering with Gaussian Sum-Product Networks

Jan 1, 2023

Tractable Mode-Finding in Sum-Product Networks with Gaussian Leaves

Jan 1, 2022

Tractable Classification with Non-Ignorable Missing Data Using Generative Random Forests

Jan 1, 2022

Time Robust Trees: Using Temporal Invariance to Improve Generalization

Jan 1, 2022

Integrating Question Answering and Text-to-SQL in Portuguese

Jan 1, 2022

Exploration Versus Exploitation in Model-Based Reinforcement Learning: An Empirical Study

Jan 1, 2022

Differentiable Planning with Indefinite Horizon

Jan 1, 2022

Differentiable Planning for Optimal Liquidation

Jan 1, 2022

Learning Probabilistic Sentential Decision Diagrams under Logic Constraints by Sampling and Averaging

Probabilistic Sentential Decision Diagrams (PSDDs) are effective tools for combining uncertain knowledge in the form of (learned) probabilities and certain knowledge in the form of logical constraints. Despite some promising recent advances on the topic, very little attention has been given to the problem of effectively learning PSDDs from data and logical constraints in large domains. In this paper, we show that a simple strategy of sampling and averaging PSDDs leads to state-of-the-art performance in many tasks. We overcome some of the issues with previous methods by employing a top-down generation of circuits from a logic formula represented as a BDD. We discuss how to locally grow the circuit while achieving a good trade-off between the complexity and the goodness-of-fit of the resulting model. Generalization error is further decreased by aggregating sampled circuits into an ensemble of models. Experiments on various domains show that the approach efficiently learns good models even in very low data regimes, while remaining competitive for large sample sizes.
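The sample-and-average strategy sketched in the abstract can be illustrated in miniature. The sketch below is not the paper's method: real PSDDs are circuits respecting a logical formula, whereas here each sampled "model" is stood in for by a fully factorized Bernoulli distribution over binary variables, and the ensemble scores an assignment with the log of a uniform mixture of the sampled models' likelihoods. All function names are hypothetical.

```python
import math
import random

def sample_model(n_vars, rng):
    # Stand-in for sampling one circuit: random per-variable
    # Bernoulli parameters (a fully factorized distribution).
    return [rng.uniform(0.1, 0.9) for _ in range(n_vars)]

def model_prob(model, x):
    # Likelihood one stand-in model assigns to a binary assignment x.
    p = 1.0
    for theta, xi in zip(model, x):
        p *= theta if xi else (1.0 - theta)
    return p

def ensemble_log_prob(models, x):
    # Averaging step: a uniform mixture over the sampled models.
    avg = sum(model_prob(m, x) for m in models) / len(models)
    return math.log(avg)

rng = random.Random(0)
ensemble = [sample_model(4, rng) for _ in range(10)]
print(ensemble_log_prob(ensemble, [1, 0, 1, 1]))
```

The mixture average is what reduces variance relative to any single sampled model; in the paper the same aggregation is applied to circuits sampled top-down from the BDD of the constraint formula.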

Jan 1, 2021

Fast And Accurate Learning of Probabilistic Circuits by Random Projections

Jan 1, 2021