Batch normalization (BN) is a technique used to enhance training speed and generalization performance by mitigating internal covariate shift. However, implementing BN in hardware presents challenges due to the need for an additional complex circuit to normalize, scale, and shift activations. We propose a hardware binary neural network (BNN) system capable of performing BN in hardware, which consists of an AND-type flash memory array as a synapse and a voltage sense amplifier (VSA) as a neuron. In this system, hardware BN was implemented using a voltage shifter that adjusts the threshold of the binary neuron. To validate the effectiveness of the proposed hardware-based BNN system, we fabricated a charge trap flash with a gate stack of SiO2
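The equivalence that makes this threshold-shifting approach work can be sketched in software: for a binary (sign) activation, applying BN and then taking the sign is mathematically the same as comparing the raw pre-activation against a shifted threshold. The sketch below is illustrative only and assumes scalar BN parameters with a nonzero scale factor; the function names and values are hypothetical, not from the paper.

```python
import numpy as np

def bn_then_sign(x, gamma, beta, mu, sigma):
    # Standard batch normalization followed by a binary (sign) activation:
    # y = gamma * (x - mu) / sigma + beta, output +1 if y >= 0 else -1.
    y = gamma * (x - mu) / sigma + beta
    return np.where(y >= 0, 1, -1)

def threshold_sign(x, gamma, beta, mu, sigma):
    # Fold BN into a per-neuron threshold (assumes gamma != 0):
    # sign(gamma*(x - mu)/sigma + beta) >= 0  <=>  x >= theta  when gamma > 0,
    # with the comparison direction flipped when gamma < 0, where
    # theta = mu - beta * sigma / gamma. In hardware, shifting this
    # threshold plays the role of the BN normalize/scale/shift circuit.
    theta = mu - beta * sigma / gamma
    if gamma > 0:
        return np.where(x >= theta, 1, -1)
    return np.where(x <= theta, 1, -1)
```

Checking both paths on a range of inputs confirms they produce identical binary outputs, which is why a simple voltage shifter on the neuron's threshold can stand in for a full BN circuit.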