Please use this identifier to cite or link to this item: http://117.252.14.250:8080/jspui/handle/123456789/4892
Title: Dissection of trained neural network hydrologic models for knowledge extraction
Authors: Jain, Ashu
Kumar, Sumant
Keywords: Artificial neural networks
ANN models
Issue Date: 2009
Publisher: American Geophysical Union
Citation: Water Resources Research, Vol. 45, W07420, 2009
Abstract: Artificial neural networks (ANNs) are powerful tools for the modeling and forecasting of complex engineering systems and have been exploited by researchers to solve a variety of problems over the last couple of decades. In spite of their proven ability to provide superior model performance compared to traditional modeling approaches, they have not become popular among decision makers for operational use. This is probably because of their perceived black box nature, which does not explain or account for the underlying physical processes involved. This paper presents the results of a study aimed at a systematic dissection of the massively parallel architectures of trained ANN hydrologic models to determine whether they learn the underlying physical subprocesses during training. This has been achieved using simple qualitative and quantitative techniques. Data derived from three contrasting catchments at two different time scales were employed to develop ANN models and to test the methodologies employed for knowledge extraction. The results obtained in this study indicate that the number of hidden neurons determined during training for a particular data set corresponds to certain subprocesses of the overall physical process being modeled. It has also been found that the time scale of the data employed affects the optimum ANN architecture and the knowledge extracted.
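To make the kind of "dissection" described in the abstract concrete, the sketch below trains a small one-hidden-layer feed-forward ANN on a synthetic rainfall-runoff-like series and then correlates each hidden neuron's activation with candidate subprocess signals. This is a minimal illustration of the general idea, not the authors' method or data: the network size, the input lags, and the "quick flow" and "baseflow" component signals are all hypothetical assumptions introduced here for demonstration.

```python
# Minimal sketch (assumed setup, not the paper's code): train a small ANN on
# synthetic rainfall-runoff data, then check which hydrologic subprocess each
# hidden neuron's activation most resembles.
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic data: rainfall drives a fast and a slow runoff component ---
n = 2000
rain = rng.gamma(shape=0.3, scale=5.0, size=n)                      # spiky rainfall
quick = np.convolve(rain, np.exp(-np.arange(10) / 2.0), mode="full")[:n]   # quick flow
base = np.convolve(rain, np.exp(-np.arange(200) / 60.0), mode="full")[:n]  # baseflow-like
flow = quick + 0.3 * base + rng.normal(0, 0.1, n)                   # "observed" runoff

# Inputs: current and lagged rainfall plus lagged flow (a common ANN setup)
X = np.column_stack([rain, np.roll(rain, 1), np.roll(rain, 2), np.roll(flow, 1)])
y = flow.reshape(-1, 1)

# Standardize inputs and target
X = (X - X.mean(0)) / X.std(0)
yn = (y - y.mean()) / y.std()

# --- Feed-forward network: 4 inputs, 3 sigmoid hidden neurons, 1 linear output ---
n_hidden = 3
W1 = rng.normal(0, 0.5, (X.shape[1], n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1));          b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.05
for epoch in range(2000):                  # plain batch gradient descent
    H = sigmoid(X @ W1 + b1)               # hidden-layer activations
    pred = H @ W2 + b2
    err = pred - yn
    # Backpropagation
    dW2 = H.T @ err / n
    db2 = err.mean(0)
    dH = (err @ W2.T) * H * (1 - H)
    dW1 = X.T @ dH / n
    db1 = dH.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# --- "Dissection": compare each hidden neuron with candidate subprocess signals ---
H = sigmoid(X @ W1 + b1)
candidates = {"quick flow": quick, "baseflow": base, "rainfall": rain}
for j in range(n_hidden):
    corrs = {name: np.corrcoef(H[:, j], sig)[0, 1] for name, sig in candidates.items()}
    best = max(corrs, key=lambda k: abs(corrs[k]))
    print(f"hidden neuron {j}: best-matching subprocess = {best} (r = {corrs[best]:.2f})")
```

In this toy setting, hidden neurons tend to align with either the fast or the slow component of the synthetic runoff, which is the qualitative pattern the abstract reports for real catchments at different time scales.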
URI: http://117.252.14.250:8080/jspui/handle/123456789/4892
Appears in Collections:Research papers in International Journals

Files in This Item:
File | Size | Format
Restricted Access.pdf | 411.81 kB | Adobe PDF

