neural networks

Capture the Bot: Using Adversarial Examples to Improve CAPTCHA Robustness to Bot Attacks

To date, CAPTCHAs have served as the first line of defense against unauthorized access by (malicious) bots to web-based services, while maintaining a trouble-free experience for human visitors. However, recent work in the literature has provided evidence of sophisticated bots that exploit advancements in machine learning (ML) to easily bypass existing CAPTCHA-based defenses. In this work, we take a first step toward addressing this problem. We introduce CAPTURE, a novel CAPTCHA scheme based on adversarial examples.
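As an illustration of the adversarial-example building block (not of the CAPTURE scheme itself), a minimal FGSM-style perturbation of a CAPTCHA image could be sketched as follows, assuming a differentiable PyTorch classifier `model` and placeholder `image`/`label` tensors:

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, eps=0.03):
    """One Fast Gradient Sign Method step: nudge the image in the direction
    that increases the classifier's loss, bounded elementwise by eps."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    adv = image + eps * image.grad.sign()   # signed-gradient perturbation
    return adv.clamp(0.0, 1.0).detach()     # keep pixels in a valid range
```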

Fusing Self-Organized Neural Network and Keypoint Clustering for Localized Real-Time Background Subtraction

Moving object detection in video streams plays a key role in many computer vision applications. In particular, separating background from foreground items is a main prerequisite for more complex tasks, such as object classification, vehicle tracking, and person re-identification. Despite the progress made in recent years, a main challenge of moving object detection remains the management of dynamic aspects, including bootstrapping and illumination changes.

Non-convex Multi-species Hopfield Models

In this work we introduce a multi-species generalization of the Hopfield model for associative memory, where neurons are divided into groups and both inter-group and intra-group pairwise interactions are considered, with different intensities. Thus, this system contains two of the main ingredients of modern deep neural-network architectures: Hebbian interactions to store patterns of information and multiple layers coding different levels of correlations.
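As an illustrative sketch (not necessarily the paper's exact definition), a two-species version of such a Hamiltonian, with patterns ξ and η stored Hebbian-wise within and across two groups σ and τ of sizes N1 and N2, could read:

```latex
H(\sigma,\tau) \;=\;
 -\frac{a}{N_1}\sum_{\mu}\Big(\sum_{i}\xi_i^{\mu}\sigma_i\Big)^{2}
 -\frac{b}{N_2}\sum_{\mu}\Big(\sum_{j}\eta_j^{\mu}\tau_j\Big)^{2}
 -\frac{c}{\sqrt{N_1 N_2}}\sum_{\mu}\Big(\sum_{i}\xi_i^{\mu}\sigma_i\Big)\Big(\sum_{j}\eta_j^{\mu}\tau_j\Big)
```

where a and b weight the intra-group couplings and c the inter-group coupling.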

Neural Networks with a Redundant Representation: Detecting the Undetectable

We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P = 4. The latter is known to be able to store, via Hebbian learning, an amount of patterns scaling as N^{P-1}, where N denotes the number of constituent binary neurons interacting P-wise.
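For reference, the order-P dense associative memory mentioned above is commonly defined (in the dense Hopfield literature; this is background rather than a quotation from the paper) by the energy

```latex
E(\sigma) \;=\; -\sum_{\mu=1}^{K}\Big(\sum_{i=1}^{N}\xi_i^{\mu}\,\sigma_i\Big)^{P},
\qquad K_{\max}\sim N^{P-1},
```

so that P = 4 corresponds to quartic, four-neuron interactions.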

Low-dimensional dynamics for working memory and time encoding

Our decisions often depend on multiple sensory experiences separated by time delays. The brain can remember these experiences and, simultaneously, estimate the timing between events. To understand the mechanisms underlying working memory and time encoding, we analyze neural activity recorded during delays in four experiments on nonhuman primates. To disambiguate potential mechanisms, we propose two analyses, namely, decoding the passage of time from neural data and computing the cumulative dimensionality of the neural trajectory over time.
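A minimal sketch of the second analysis, the cumulative dimensionality of a neural trajectory, computed here via PCA over growing time windows (the data layout, shape time × neurons, and the 90% variance threshold are assumptions for illustration):

```python
import numpy as np
from sklearn.decomposition import PCA

def cumulative_dimensionality(traj, var_threshold=0.9):
    """For each time t, the number of principal components needed to explain
    var_threshold of the variance of the trajectory up to time t.
    traj: array of shape (T, n_neurons), e.g. trial-averaged firing rates."""
    dims = []
    for t in range(2, traj.shape[0] + 1):
        pca = PCA().fit(traj[:t])
        cum_var = np.cumsum(pca.explained_variance_ratio_)
        dims.append(int(np.searchsorted(cum_var, var_threshold) + 1))
    return np.array(dims)
```

A trajectory whose dimensionality keeps growing over the delay is consistent with time encoding, whereas a saturating curve points to a low-dimensional memory state.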

Randomness in neural networks: an overview

Neural networks, as powerful tools for data mining and knowledge engineering, can learn from data to build feature-based classifiers and nonlinear predictive models. Training neural networks involves the optimization of nonconvex objective functions, and usually, the learning process is costly and infeasible for applications associated with data streams. A possible, albeit counterintuitive, alternative is to randomly assign a subset of the networks’ weights so that the resulting optimization task can be formulated as a linear least-squares problem.
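A minimal sketch of this idea, in the spirit of random-feature networks such as random vector functional links or extreme learning machines (layer size, seed and the tanh nonlinearity are illustrative choices):

```python
import numpy as np

def fit_random_network(X, y, n_hidden=200, seed=0):
    """Fix the hidden-layer weights at random; only the linear readout is
    trained, which reduces learning to a linear least-squares problem."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                          # random nonlinear features
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)    # closed-form readout fit
    return W, b, beta

def predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```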

Takagi-Sugeno Fuzzy Systems Applied to Voltage Prediction of Photovoltaic Plants

A high penetration level of intermittent and variable renewable electricity generation introduces significant challenges to the energy management of modern smart grids. Solar photovoltaics and wind energy have uncertain and non-dispatchable output, which leads to concerns regarding the technical and economic feasibility of reliably integrating large amounts of variable generation into electric grids. In this scenario, accurate forecasting of renewable generation outputs is of paramount importance.

A neural network based prediction system of distributed generation for the management of microgrids

In the modern scenario of smart-grids, the concept of virtual power plant (VPP) is undoubtedly a cornerstone for the smooth integration of renewable energy sources into existing energy systems with a high penetration level. A VPP is the aggregation of decentralized medium-scale power sources, including photovoltaic and wind power plants, combined heat and power units, as well as demand-responsive loads and storage systems, with a twofold objective.

Microgrid energy management systems design by computational intelligence techniques

With the capillary spread of multi-energy systems such as microgrids, nanogrids, smart homes and hybrid electric vehicles, the design of a suitable Energy Management System (EMS) able to schedule the local energy flows in real time plays a key role in the development of Renewable Energy Sources (RESs) and in reducing pollutant emissions. In the literature, most of the proposed EMSs are based on energy system prediction models, which enable the execution of a specific optimization algorithm.

Kafnets: Kernel-based non-parametric activation functions for neural networks

Neural networks are generally built by interleaving (adaptable) linear layers with (fixed) nonlinear activation functions. To increase their flexibility, several authors have proposed methods for adapting the activation functions themselves, endowing the networks with varying degrees of additional flexibility. None of these approaches, however, has gained wide acceptance in practice, and research on this topic remains open. In this paper, we introduce a novel family of flexible activation functions that are based on an inexpensive kernel expansion at every neuron.
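A minimal sketch of a kernel-based activation of this kind, with a Gaussian kernel expanded over a fixed dictionary and trainable mixing coefficients (dictionary size, range and bandwidth are illustrative assumptions rather than the paper's exact settings):

```python
import torch
import torch.nn as nn

class KernelActivation(nn.Module):
    """Per-element activation f(x) = sum_k alpha_k * exp(-gamma * (x - d_k)^2),
    where the dictionary d is fixed and the coefficients alpha are learned."""
    def __init__(self, dict_size=20, bound=3.0, gamma=1.0):
        super().__init__()
        self.register_buffer("d", torch.linspace(-bound, bound, dict_size))
        self.alpha = nn.Parameter(torch.randn(dict_size) * 0.1)
        self.gamma = gamma

    def forward(self, x):
        # Broadcast x against the dictionary and mix the kernel values.
        k = torch.exp(-self.gamma * (x.unsqueeze(-1) - self.d) ** 2)
        return k @ self.alpha
```

Since the dictionary is fixed, only the mixing coefficients are trained, so the extra cost per neuron stays small.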
