Homomorphic encryption (HE) enables privacy-preserving machine learning by allowing computation directly over encrypted data, and HE-based inference algorithms are already practical even for relatively large Convolutional Neural Networks (CNNs). Encrypted training, on the other hand, remains a major challenge, with current solutions taking up to weeks of computation. In this paper, we introduce a framework for training and inference on Weightless Neural Network (WNN) models over encrypted data. Compared to CNNs, we show that HE-based WNNs offer much better performance with a relatively small accuracy loss. Our solution is based on new building blocks we introduce for the TFHE scheme, which may be of independent interest. We achieve 91.71% accuracy on the MNIST dataset after only 3.5 min of encrypted training (multi-threaded), going up to 93.76% in 3.5 h. Compared to the state of the art on HE-based CNN training, Glyph (Lou et al., NeurIPS 2020), this represents a speedup of up to 1200 times with an accuracy loss of at most 5.4%. For the HAM10000 dataset, we achieve 67.85% accuracy in just 1.5 min, going up to 69.85% after 1 h, which represents a 0.65% accuracy improvement while being 60 times faster than Glyph. We also provide solutions for small-scale encrypted training: in a single thread using less than 200 MB of memory, we train on the entire Wisconsin Breast Cancer dataset in just 11 s, achieving 97.3% accuracy.
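For readers unfamiliar with weightless models, the following is a minimal plaintext sketch of a WiSARD-style WNN, the classical architecture on which such models are based. It is purely illustrative: the class and method names, the use of Python sets as RAM nodes, and the random bit mapping are our assumptions for exposition and do not reflect the paper's encrypted TFHE-based implementation.

```python
import numpy as np

class WiSARD:
    """Illustrative WiSARD-style weightless neural network (plaintext only)."""

    def __init__(self, input_bits, tuple_size, n_classes, seed=0):
        assert input_bits % tuple_size == 0
        rng = np.random.default_rng(seed)
        # Fixed pseudo-random mapping of input bits into tuples,
        # shared by all class discriminators.
        self.mapping = rng.permutation(input_bits).reshape(-1, tuple_size)
        # One set of RAM nodes (lookup tables) per class discriminator.
        self.rams = [[set() for _ in range(len(self.mapping))]
                     for _ in range(n_classes)]

    def _addresses(self, x):
        # Each tuple of input bits is read as an integer RAM address.
        for m in self.mapping:
            yield int("".join(str(b) for b in x[m]), 2)

    def train(self, x, label):
        # Training is a single pass: mark each addressed RAM position.
        for ram, addr in zip(self.rams[label], self._addresses(x)):
            ram.add(addr)

    def predict(self, x):
        # Score each class by how many of its RAMs recognize the input.
        addrs = list(self._addresses(x))
        scores = [sum(a in ram for ram, a in zip(rams, addrs))
                  for rams in self.rams]
        return int(np.argmax(scores))
```

Because training is a single memory-write pass rather than iterative gradient descent, this style of model is a natural fit for fast encrypted training, which is the property the framework exploits.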