Economic crisis and public education. A productivity analysis using a Hicks-Moorsteen index (2018). Economic Modelling, 71, 34–44.

Juan Aparicio (University Miguel Hernandez of Elche), Laura López-Torres (University of Alcalá) and Daniel Santín (Complutense University of Madrid).

Abstract. The economic crisis forced politicians to make public finances sustainable. The education sector was one of the most adversely affected by the control of public expenditure. This paper analyzes the drivers of productivity change in especially vulnerable public schools during the crisis. We use the Hicks-Moorsteen index, a seldom-applied methodology that yields feasible results under variable returns to scale. To illustrate the benefits of this index, we use a sample of 298 Catalan public primary schools between 2009 (when budgetary constraints started) and 2014. The results reveal that during the crisis schools improved their total factor productivity by raising academic achievement despite cutbacks in resources. We also find a strong convergence pattern during the financial crisis, driven by the catch-up process of some schools. The findings have important policy implications, suggesting that a monitoring system should be set up for use by policy makers.
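
For reference, a common form of the index (the Bjurek-style definition in base period t, with output distance function D_o and input distance function D_i; the paper's exact specification may differ) is the ratio of a Malmquist output quantity index to a Malmquist input quantity index:

\[
\mathrm{TFP}_{HM}^{t} \;=\; \frac{D_o^{t}(x^{t}, y^{t+1}) \,/\, D_o^{t}(x^{t}, y^{t})}{D_i^{t}(y^{t}, x^{t+1}) \,/\, D_i^{t}(y^{t}, x^{t})},
\]

usually averaged geometrically over the base-t and base-(t+1) versions; a value above one indicates total factor productivity growth.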

Serial concatenation of a block code and a 2D convolutional code (2018). Multidimensional Systems and Signal Processing, 29, 1–15.

Victoria Herranz (Universidad Miguel Hernández), Diego Napp (Universidade de Aveiro, Campus Universitário de Santiago) and Carmen Perea (Universidad Miguel Hernández).

Abstract. In this paper we study two different concatenation schemes for two-dimensional (2D) convolutional codes. We use the Fornasini–Marchesini state-space representation of 2D linear systems to describe our concatenated codes. We also present upper and lower bounds on the distance of the proposed concatenated codes.
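
As background, one standard form of the Fornasini–Marchesini state-space model (the so-called second model; the paper's exact variant may differ) updates a local state x(i, j) along two independent axes:

\[
x(i+1, j+1) = A_1\, x(i, j+1) + A_2\, x(i+1, j) + B_1\, u(i, j+1) + B_2\, u(i+1, j), \qquad y(i, j) = C\, x(i, j) + D\, u(i, j),
\]

where, working over a finite field, u(i, j) carries the information symbols and the resulting output sequences y(i, j) form the 2D convolutional code.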

ENMX: An elastic network model to predict the FOREX market evolution (2018). Simulation Modelling Practice and Theory, 86, 1–10.

Antonio V. Contreras (Universidad Católica de Murcia), Antonio Llanes (Universidad Católica de Murcia), Alberto Pérez-Bernabeu (Miguel Hernandez University), Sergio Navarro (Artificial Intelligence Talentum, S.L., Campus Universitario de Espinardo de Murcia), Horacio Pérez-Sánchez (Universidad Católica de Murcia), Jose J. López-Espín (Miguel Hernandez University) and José M. Cecilia (Universidad Católica de Murcia).

Abstract. The foreign exchange (FOREX) market is a financial market in which participants, such as international banks, companies or private investors, can both invest in and speculate on exchange rates. This market is considered one of the largest financial markets in the world in terms of trading volume. Indeed, just-in-time price prediction for a currency pair exchange rate (e.g. EUR/USD) provides valuable information for companies and investors, who can take different actions to improve their business. This paper introduces the ENMX (elastic network model for the FOREX market) algorithm, a new algorithm inspired by the behaviour of macromolecules in dissolution, to model the evolution of the FOREX market. The algorithm allows the system to escape from local minima, reproducing the unstable, out-of-equilibrium nature of the FOREX market. ENMX introduces several novelties in the simulation of the FOREX market. First, it enables the user to simulate the joint evolution of up to 21 interconnected currency pairs, thus emulating the behaviour of the real-world FOREX market. Second, the interaction between investors and each particular quotation, which may introduce slight deviations in quotation prices, is represented by a random movement; to model these variations we analyse different probability distributions, such as the Gaussian and the pseudo-Voigt, with the latter showing better behaviour. Finally, the ENMX algorithm is compared to traditional econometric approaches, such as a VAR model and a driftless random walk, using a classical statistical measure and a profitability measure. The results show that ENMX outperforms both models in terms of quality by a wide margin.
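
As a toy illustration of the random-movement component only (not the authors' ENMX implementation; all names and parameter values below are made-up assumptions), price increments can be drawn from a pseudo-Voigt-style mixture: a Lorentzian (Cauchy) draw with probability eta and a Gaussian draw otherwise, which yields the occasional large jumps that a pure Gaussian misses:

import numpy as np

def pseudo_voigt_increments(n, eta=0.3, sigma=1e-4, gamma=5e-5, seed=0):
    """Draw n price increments from a pseudo-Voigt-style mixture:
    Lorentzian (Cauchy) with probability eta, Gaussian otherwise."""
    rng = np.random.default_rng(seed)
    lorentzian = rng.random(n) < eta          # which draws are heavy-tailed
    gauss = rng.normal(0.0, sigma, n)         # small everyday fluctuations
    cauchy = gamma * rng.standard_cauchy(n)   # rare large jumps
    return np.where(lorentzian, cauchy, gauss)

# Toy random walk for a EUR/USD-like quotation starting at 1.10.
prices = 1.10 + np.cumsum(pseudo_voigt_increments(1000))

The heavier Lorentzian tail is one plausible reason a pseudo-Voigt fit can track quotation-price variations better than a Gaussian.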

On preparedness resource allocation planning for natural disaster relief under endogenous uncertainty with time-consistent risk-averse management (2018). Computers and Operations Research, 98, 84–102.

Laureano F. Escudero (Universidad Rey Juan Carlos), M. Araceli Garín (Universidad del País Vasco), Juan F. Monge (Universidad Miguel Hernández) and Aitziber Unzueta (Universidad del País Vasco).

Abstract. A preparedness resource allocation model and an algorithmic approach are presented for a three-stage stochastic problem for managing natural disaster mitigation. Preparedness consists of warehouse location, capacity assignment and commodity procurement on the one hand, and refurbishment of the rescue network infrastructure on the other. Two types of uncertainty are considered: exogenous uncertainty, which is due to the lack of full knowledge about the probability and intensity of the disaster at each focal point in a given network; and endogenous uncertainty, which depends on the decision-maker’s investment in obtaining greater accuracy about the occurrence of the disaster and in reinforcing the network infrastructure. A stochastic mixed 0–1 bilinear optimization model is presented. Additionally, a time-consistent stochastic dominance-based risk-averse measure for a set of profiles in a multifunction setting is introduced. Both types of elements lead to large-scale problems, so some kind of decomposition algorithm is required. Exploiting the special features of the three-stage problem studied in this work, we introduce the Cluster Dual Descent Algorithm, which obtains feasible solutions based on duality theory. Computational results are reported for a well-known real-life case, comparing the performance of the risk-neutral and risk-averse versions of the models under both exogenous and endogenous uncertainty.
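
For orientation, a common linearisation of such dominance-based measures in this literature (a generic sketch, not necessarily the exact functional introduced in the paper) attaches to each profile p a cost threshold \phi_p and a bound e_p on the expected excess of the scenario cost v^\omega over that threshold:

\[
v^{\omega} - \phi_p \le s_p^{\omega}, \quad s_p^{\omega} \ge 0 \;\; \forall \omega \in \Omega, \qquad \sum_{\omega \in \Omega} w^{\omega} s_p^{\omega} \le e_p,
\]

where w^\omega is the probability of scenario \omega and s_p^\omega is an auxiliary excess variable; time consistency is obtained by imposing such constraints at intermediate nodes of the scenario tree rather than only on final outcomes.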

Using non-radial DEA to assess school efficiency in a cross-country perspective: An empirical analysis of OECD countries (2018). Omega, 79, 9–20.

Juan Aparicio (University Miguel Hernandez of Elche), José M. Cordero (University of Extremadura), Martín González (University Miguel Hernandez of Elche) and José J. López-Espín (University Miguel Hernandez of Elche).

Abstract. In this paper we use data from OECD countries participating in PISA 2012 to assess the efficiency of schools in a cross-country framework. In the analysis, and in contrast to previous applications, we consider that schools might concentrate their efforts on improving the results in one dimension of the educational output to a greater extent than in the other. To do this, we rely on non-radial efficiency measures of performance and the estimation of an educational production function based upon Data Envelopment Analysis (DEA) techniques. Specifically, non-radial DEA measures allow different levels of inefficiency to be identified for each output considered (reading and maths). In particular, we apply a non-radial measure based on Ando et al. [5] and Aparicio et al. [12]. Our results show that the majority of schools in OECD countries tend to be less efficient in reading than in mathematics.
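
To make the non-radial idea concrete, the sketch below solves a generic output-oriented Russell-type DEA program under variable returns to scale, where each output (reading, maths) receives its own expansion factor instead of a single radial one; this is a textbook formulation with made-up data, not the specific Ando et al. / Aparicio et al. measure applied in the paper:

import numpy as np
from scipy.optimize import linprog

def nonradial_output_dea(X, Y, k):
    """Output-oriented Russell measure under VRS for unit k.
    X: (units x inputs), Y: (units x outputs). Maximises the mean
    output expansion factor beta_r, one factor per output."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision vector: [beta_1..beta_s, lambda_1..lambda_n].
    c = np.concatenate([-np.ones(s) / s, np.zeros(n)])   # maximise mean beta
    A_ub, b_ub = [], []
    for r in range(s):                # beta_r * y_rk <= sum_j lambda_j * y_rj
        row = np.zeros(s + n)
        row[r] = Y[k, r]
        row[s:] = -Y[:, r]
        A_ub.append(row)
        b_ub.append(0.0)
    for i in range(m):                # sum_j lambda_j * x_ij <= x_ik
        row = np.zeros(s + n)
        row[s:] = X[:, i]
        A_ub.append(row)
        b_ub.append(X[k, i])
    A_eq = [np.concatenate([np.zeros(s), np.ones(n)])]   # VRS: sum lambda = 1
    bounds = [(1.0, None)] * s + [(0.0, None)] * n       # beta_r >= 1, lambda >= 0
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.array(A_eq), b_eq=[1.0], bounds=bounds)
    return res.x[:s]   # one expansion factor per output; 1.0 means efficient

# Made-up data: 4 schools, 1 input (resources), 2 outputs (reading, maths).
X = np.array([[5.0], [6.0], [4.0], [7.0]])
Y = np.array([[480.0, 500.0], [510.0, 520.0], [470.0, 495.0], [500.0, 530.0]])
print(nonradial_output_dea(X, Y, k=0))

A school that could raise reading by 4% but maths by only 1% gets two distinct factors here, which is exactly the asymmetry a single radial measure would hide.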

On capacity expansion planning under strategic and operational uncertainties based on stochastic dominance risk averse management (2018). Computational Management Science, 15, 1–22.

Laureano F. Escudero (Universidad Rey Juan Carlos) and Juan F. Monge (Universidad Miguel Hernández).

Abstract. A new scheme for dealing with uncertainty in scenario trees is presented for dynamic mixed 0–1 optimization problems with strategic and operational stochastic parameters. We generically refer to this type of problem as capacity expansion planning (CEP) in a given system, e.g., a supply chain, production, rapid transit network, or energy generation and transmission network. The strategic scenario tree is usually a multistage one, and replicas of the strategic nodes root operational structures in the form of either a special scenario graph or a two-stage scenario tree, depending on the type of operational activity in the system. Those operational scenario structures impact the constraints of the model and, thus, the decomposition methodology for solving these usually large-scale problems. This work presents a modeling framework for some of the risk-neutral and risk-averse measures to consider for CEP problem solving. Two types of risk-averse measures are considered. The first is a time-inconsistent mixture of chance-constrained and second-order stochastic dominance (SSD) functionals of the value of a given set of functions up to the strategic nodes in selected stages along the time horizon. The second is a strategic node-based, time-consistent SSD functional for the set of operational scenarios in the strategic nodes at selected stages. A specialization of the nested stochastic decomposition methodology for solving this problem is outlined. Its advantages and drawbacks, as well as schemes to at least partially overcome those drawbacks, are also presented.
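
For reference (standard scenario-based forms, written here for a maximisation problem; the paper works with node-based variants on the strategic tree), the two ingredients of the first measure bound, for a threshold \phi, the probability that the scenario value v^\omega falls short of \phi and its expected shortfall:

\[
\sum_{\omega \in \Omega} w^{\omega}\, \mathbf{1}\{ v^{\omega} < \phi \} \le \alpha, \qquad \sum_{\omega \in \Omega} w^{\omega} \, (\phi - v^{\omega})_{+} \le e,
\]

where \alpha and e are modeller-chosen bounds; the time-consistent variant instead imposes the SSD functional on the set of operational scenarios hanging from each selected strategic node, rather than once over the whole horizon.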