Brain demonstration download
weixin_39821620
2019-05-24 05:30:16
A Flash animation showing the structure of the brain; a good aid for beginners learning brain anatomy.
Related download link:
//download.csdn.net/download/zhuhuaping1228/2328252?utm_source=bbsseo
Neural network tool written in C
NEURAL NETWORK PC TOOLS SOFTWARE USER'S GUIDE
$Revision: 1.2 $  $Date: 02 Jan 1990 15:40:54 $

INTRODUCTION

The software described in this User's Guide is that described in the chapter on Neural Network PC Tool Implementations in the book entitled Neural Network PC Tools: A Practical Guide, to be published by Academic Press in 1990. This software may be copied and distributed AS LONG AS IT IS NOT MODIFIED. In particular, any problems with the source code should be brought to the attention of the authors. If you use this software, consider it as shareware and please send $10.00 to the authors at the following address: Roy Dobbins, 5833 Humblebee Road, Columbia, MD 21045. As additions are made to this software diskette, such as including self-organizing (Kohonen) networks, the price will increase. It is anticipated that the price for the diskette sold in conjunction with the book will be about $20.

BACKGROUND

Much excitement exists due to the apparent ability of artificial neural networks to imitate the brain's ability to make decisions and draw conclusions when presented with complex, noisy and/or partial information. This software is for the engineer or programmer who is interested in solving practical problems with neural networks. It is a myth that the only way to achieve results with neural networks is with a million dollars, a supercomputer, and an interdisciplinary team of Nobel laureates. There are some commercial vendors out there who would like you to believe that, though. Using simple hardware and software tools, it is possible to solve practical problems that are otherwise impossible or impractical. Neural network tools (NNT's) offer a solution to some problems that can't be solved any other way known to the authors.

THE BACK-PROPAGATION NNT: BATCHNET

This release contains both source and executable code for a "standard" three layer back-propagation neural network. The executable program is called batchnet.exe; its source code is in the file batchnet.c. The program for generating random weights used as input to the training run is weights.exe; its source code is in weights.c. These files were compiled using Turbo C v 2.0, but can also be compiled in Microsoft C. They were compiled using the 80x87 emulator mode, so that they will run even if you don't have a co-processor. If you have a coprocessor and want batchnet to run faster, which may be especially important in training, you can recompile batchnet.c using the 80x87 option. Always use the compact model. To run the batchnet program, you must specify the run file that it will use.
Demo.run is the run file for the demo.bat demonstration. Look at the demo.bat and demo.run files to see what we mean.
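The run file's contents are spelled out later in this guide: the number of runs first, then, for each run, the filenames and the numeric parameters. As a purely hypothetical illustration (the demo.run on the diskette is authoritative; the pattern counts, eta, alpha, and the "rand.wts" name below are all made up, and the actual whitespace layout may differ), a two-run train-then-test file might look like:

```
2
train.out train.err train.pat rand.wts  train.wts
200 500 9 4 2 0.15 0.075
test.out  test.err  test.pat  train.wts test.wts
100 1   9 4 2 0.15 0.075
```

Note how the test run reuses the training run's output weights file as its input weights file, and runs for a single epoch.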
Demo.bat also illustrates one of the options for batchnet. You can specify the interval of iterations between average sum-squared error printouts with the -e option: -e10 prints it out every 10 iterations. The default number of iterations between error printouts is 100. The other option for batchnet is to specify what average sum-squared error (per output node and per pattern) is required for the program to terminate training. The default value is 0.02: a command of -d.01 will override this with an error value of .01.

In the run file, you specify a number of things. Look at demo.run in detail to see what they are; there is explanation following the run data for the two runs that tells what goes where. First, you specify the number of runs. The demo has two. This is fairly typical. You often have a training run followed by a test run, as is the case in the demo. You can, however, set up the software to do as many runs as you want: hence the name "batchnet". You then specify the filenames for a number of files: the output file that gives the values of the output nodes for each pattern on the last iteration (or the only iteration, if you are in testing mode and there is only one iteration), the error file that gives you the average sum-squared error value each specified number of iterations, the source pattern file (values normalized between 0 and 1), the input weights file (generated by weights.exe for a training run, and consisting of the output weights file from training for a testing run), and the output weights file which gives you weight values after the last iteration. Note that the pattern files have values for each input node followed by values for each output node followed by an ID field that you can use to identify each pattern in some way. The input and output node values should be between 0 and 1. Following filenames, you specify, for each run, the number of input patterns, the number of epochs (iterations of the entire pattern set), the number of input nodes, number of hidden nodes, number of output nodes, the value for the learning coefficient (eta), and the value for the momentum factor (alpha). The number of epochs varies a lot during training, but often is in the range of 100-1000; during testing, you only do one iteration.

Sample files are given that you can run with demo.bat; the output files you will get when you run the demo are already on the diskette as mytest.out, mytrain.out, mytrain.wts, mytest.wts, mytrain.err, and mytest.err. You will get similar files without the "my" prefix when you run the demo.bat program, and you can compare corresponding files to see that they are the same. All you have to do is run "demo.bat" in order to both train and test the batchnet artificial neural network on the patterns in the train.pat and test.pat files. These pattern files are built from actual electroencephalogram (EEG) spike parameter data, and illustrate the use of a parameter-based NNT. The training phase of demo.bat will probably take about 45 minutes on a 4.77 MHz 8088 PC with coprocessor. A 12 MHz Compaq with coprocessor takes about 18 minutes. A 10 MHz Grid 80286 laptop with no coprocessor takes about 140 minutes. The coprocessor makes the difference!

HINTON DIAGRAMS

Overview

This program displays Hinton diagrams - graphical representations of neural network weights. The program assumes that the weights for a three layer network have been stored in a disk file as ASCII floating point numbers. An example of a valid weights file that you have on this shareware diskette is mytrain.wts.

System Requirements

You need a PC with EGA or VGA to run this. We have never tried it on a CGA, but in theory you should be able to get something there too. Ensure that the necessary driver files are all present in the directory from which HINTON.EXE is run: HINTON.EXE, EGAVGA.BGI, CGA.BGI.

Use

To use the program, at the DOS prompt type:

    hinton {-c} datafile input hidden output

    -c        no color
    datafile  name of data file
    input     number of units in input layer
    hidden    number of units in hidden layer
    output    number of units in output layer

Use the -c option if you have a monochrome screen or if you want to make hardcopies of the screen. Currently HINTON.EXE only works with three layer feedforward networks.

Data File Organization

The file must be in the form of ASCII text floating point numbers, in the order given below:

    data_file is :-
        input_layer_to_hidden_layer_weights
        hidden_layer_to_output_layer_weights

    input_layer_to_hidden_layer_weights is :-
        weights_for_hidden_unit_0
        weights_for_hidden_unit_1
        weights_for_hidden_unit_2
        ...
        weights_for_hidden_unit_h-1

    hidden_layer_to_output_layer_weights is :-
        weights_for_output_unit_0
        weights_for_output_unit_1
        weights_for_output_unit_2
        ...
        weights_for_output_unit_o-1

    weights_for_hidden_unit_n is :-
        weight from input unit 0 to hidden unit n
        weight from input unit 1 to hidden unit n
        weight from input unit 2 to hidden unit n
        ...
        weight from input unit i-1 to hidden unit n
        weight from bias unit to hidden unit n

    weights_for_output_unit_n is :-
        weight from hidden unit 0 to output unit n
        weight from hidden unit 1 to output unit n
        weight from hidden unit 2 to output unit n
        ...
        weight from hidden unit h-1 to output unit n
        weight from bias unit to output unit n

Note that although you must have the weights from the bias units present in the file, the current version of hinton.exe does not portray the bias weights. This will be changed in the next version of hinton.exe.

Menu

The main menu consists of the following commands, displayed in a bar at the bottom of the screen:

    Hidden  Out  View  Clear  Zoom  Shrink  Flip  Unit  Range  Quit

Brief description of commands:

Hidden  Activate the hidden layer window. Does not alter the display, but all future commands are directed to this window. (A later version of HINTON.EXE will give a positive indication of the activated window.)

Out  Activate the output layer window.

View  Display (or re-display) the data in the current window. Values are displayed as small filled rectangles. The area of a rectangle is proportional to the magnitude, while the color and fill pattern indicate the sign. Currently, positive numbers are displayed in white while negative numbers are displayed in color, the color varying from layer to layer - blue for hidden, red for output.

Printscreen  Not a menu command. To get a hardcopy of the Hinton diagrams, you must load your favorite hot key utility for your CGA, EGA or VGA screen. Furthermore, to get a good representation on a black and white printer, you should run hinton -c to suppress on-screen color information, as indicated in the command line description.

Clear  Clear the current window (does not alter the data in any way; merely erases the display window).

Zoom  Increase the magnification of the current window. The opposite of Shrink. Data is scaled up and appears larger in the window. It is possible that some of the image will be clipped, if it now falls outside the window boundaries.

Shrink  Decrease the magnification of the current window. The opposite of Zoom. Data is scaled down and appears smaller in the window. The minimum shrinkage is down to the level of one pixel, after which further Shrink commands are ignored.

Flip  Turn the image in the window through 90 degrees. Horizontally organized data is displayed vertically and vice versa. This command acts as a toggle. A subsequent Flip command rotates the image back through 90 degrees to its original orientation.

Unit  Specify the unit(s) of the current layer that are to be displayed in the window. The default is all units. You can enter a single unit or you can enter a range as a pair of numbers. For example, to display units 10 through 15, enter: 10 15

Range  Specify the units of the input to the current layer that are to be displayed in the window. The default is all input units. You can enter a single unit or you can enter a range as a pair of numbers.

Quit  Quit the program. The mode of the screen is changed from graphics back to the normal text mode of the PC.

Simple Example of Running the Hinton Program

To look at the weight values in the file "mytrain.wts", which has weights for a 9-4-2 node backprop network, just run the hinton.exe program with the following command:

    hinton mytrain.wts 9 4 2

You then see a blank screen with the prompts below. You are by default in the hidden layer window, which shows the weights from the input to the hidden layer. To see the hidden weights, hit the V key, for view. Hit Z for zoom to enlarge the weights. To see the weights in the output layer, hit O for output layer. This puts you in the output window. Then hit V for view; Z for zoom, etc.