From crabapple.srv.cs.cmu.edu!cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!noc.near.net!howland.reston.ans.net!usenet.ins.cwru.edu!gatech!usenet.ufl.edu!usenet.cis.ufl.edu!hkim Mon Jul 26 12:28:20 EDT 1993
Article: 11505 of comp.ai.neural-nets
Xref: crabapple.srv.cs.cmu.edu comp.ai.neural-nets:11505
Path: crabapple.srv.cs.cmu.edu!cantaloupe.srv.cs.cmu.edu!das-news.harvard.edu!noc.near.net!howland.reston.ans.net!usenet.ins.cwru.edu!gatech!usenet.ufl.edu!usenet.cis.ufl.edu!hkim
From: hkim@reef.cis.ufl.edu (Hyeoncheol Kim)
Newsgroups: comp.ai.neural-nets
Subject: CMAC code
Date: 24 Jul 1993 19:23:57 GMT
Organization: Univ. of Florida CIS Dept.
Lines: 236
Distribution: world
Message-ID: <22s28dINNptq@snoopy.cis.ufl.edu>
NNTP-Posting-Host: reef.cis.ufl.edu

Somebody asked for CMAC code in this newsgroup a couple of days ago.  It
was already posted on July 21st.  Since I saved that article, I am
reposting the code here.  Enjoy...

Hyeoncheol Kim  (hkim@cis.ufl.edu)
Computer and Information Sciences
University of Florida

===================================================================

Article: 10806 of comp.ai.neural-nets
Newsgroups: comp.ai.neural-nets
From: md5801@ehsn6.cen.uiuc.edu (Michael T Dunphy)
Subject: Re: WANTED: CMAC code.
Date: Wed, 21 Jul 1993 01:39:39 GMT
Keywords: CMAC neural nets, C.
Organization: University of Illinois at Urbana
Lines: 221

Here's some code.  (It isn't mine; I don't remember where I picked it up.)

/* CMAC - Cerebellum Model Articulation Controller
   as described in AI Expert, June 1992, pp. 32-41
   and Transactions of ASME, Sept. 1975, pp. 220-227

                  __________________
  X Input1 >-----|                  |----| AND |----| x Weight |----+
  X Input2 >-----|                  |----| AND |----| x Weight |----+
  Y Input1 >-----|   Interconnect   |       :            :          |    +-----------+
  Y Input2 >-----|      Matrix      |       :            :          +--->| Output    |---> Output
  Z Input1 >-----|                  |----| AND |----| x Weight |----+    | Summation |
  Z Input2 >-----|__________________|----| AND |----| x Weight |----+    +-----------+

   Signal flow: inputs (eg. quant=2 bits/dimension, dimension=3)
                -> overlapping input sensors (eg. width=2 inputs/sensor)
                -> connection matrix (connects inputs to AND gates)
                -> AND gates ("dimension" inputs each)
                -> trainable weight vector
                -> output summation

   Notes:
    1. Only one input per dimension can be active (= 1) at any time.
       Input values must be quantized into 1 of "quant" values (see the
       quantizer sketch after the listing).
    2. Input sensors overlap, each covering from 1 to "width" inputs.
       Width can vary between 1 and "quant".  Low numbers usually work
       best.
    3. The interconnect matrix is arranged so that each input vector
       (eg. X,Y,Z) activates exactly "width" AND gates.
    4. Each AND gate has "dimension" inputs.
    5. The output is the summation of the weights corresponding to the
       activated AND gates.
    6. Weights are trained using the delta rule.
    7. CMAC converges very rapidly.  The three-bit parity example can be
       solved to an accuracy of .001 in about 20 training passes (a
       stand-alone sketch of such a run follows the listing).
    8. All nonlinearity comes from the input mapping instead of a sigmoid
       function as in FFNNs.
    9. Visualize input vectors as locations in an N-dimensional space.
       The output is then the value of the function at that location.
   10. Interpolation between multiple outputs can be added to reduce the
       effect of quantization.
   11. Once the weights have been trained, the compute_output routine can
       be used alone to determine the output quickly.
   12. CMAC can be used to find fast approximations to N-dimensional
       nonlinear functions such as sqrt or distance calculations.
   13. CMAC has been used successfully to linearize transducers or to form
       the inverse function of unknown plant dynamics.
*/

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* User selectable values */
#define dimension 3          /* input dimensions */
#define quant     2          /* input quantization per dimension */
#define width     2          /* width of input sensors */
#define max_gates 10         /* ==> set to (quant**dimension)+width */
#define beta      0.4        /* learning rate for weight training */
#define err_limit 0.001      /* maximum error for any input */

#define k1 (quant+width-1)           /* intermediate calculation */
#define max(a,b) ((a>=b)?a:b)        /* max macro */
#define min(a,b) ((a<=b)?a:b)        /* min macro */

struct list_of_conns{
    int n;                           /* number of gates in list */
    int gate_ptr[max_gates];         /* indices of gates */
} conn[quant][dimension],            /* connections between inputs and gates */
  activated,                         /* list of activated gates */
  *c;                                /* pointer to a list_of_conns */

int n_inputs;                        /* number of possible input vectors */
int n_sensors;                       /* number of possible input sensors */
int input[dimension];                /* input vector */
int nbr_gates = 0;                   /* number of gates used */
int gate[max_gates];                 /* AND gates */
double weight[max_gates];            /* weight matrix */
double output, desired, error, total_error;

void form_interconnects();
void compute_output();
void train();

int main()
{
    register int i, j, k;
    int pass=1;
    double max_error=err_limit;

    printf("\n\n CMAC \n\n");
    n_inputs=(int)pow((double)quant,(double)dimension);
    n_sensors=(int)pow((double)(quant+width-1),(double)dimension);
    form_interconnects();
    printf("cmac: finished interconnects, beginning training.\n");

    while(max_error>=err_limit){     /* for each pass */
        total_error=0.0;
        max_error=0.0;
        for(i=0;i<n_inputs;i++){     /* for each input vector */
            /* ... the body of this loop is missing from this copy of the
               article.  It presumably built input[] from i, set desired
               (three-bit parity in the example), called compute_output()
               and train(), and accumulated total_error and max_error ... */
        }
        /* ... the per-pass report (using pass) is also missing ... */
    }
    return 0;
}

void form_interconnects()
{
    /* Most of this routine is missing from this copy of the article.  The
       surviving fragments show it clearing each connection list (c->n=0),
       then creating the AND gates one at a time inside nested loops,
       appending each new gate's index to the conn[][] list of every
       (input value, dimension) pair the gate connects to:

           c->gate_ptr[(c->n)++]=nbr_gates;

       and, after each gate is created:

           nbr_gates++;
           if(nbr_gates > max_gates){
               printf("cmac: error, maximum number of gates exceeded!\n");
               exit(3);     -- increase #define max_gates --
           }

       A stand-alone re-implementation of the same idea follows the
       listing. */
}

void compute_output()    /* usable during and after training */
{
    register int g, i, j;

    activated.n=0;                       /* initialization */
    output=0;
    for(g=0;g<nbr_gates;g++)             /* clear the AND gate counters */
        gate[g]=0;
    for(i=0;i<dimension;i++){            /* for each dimension */
        c=&conn[input[i]][i];            /* gates connected to the active input */
        for(j=0;j<c->n;j++){             /* increment all gates in list */
            g=c->gate_ptr[j];
            gate[g]++;
            if(((i+1)==dimension)&&(gate[g]==dimension)){   /* if activated */
                /* generate list of activated gates */
                activated.gate_ptr[activated.n++]=g;
                output += weight[g];     /* update output */
            }
        }
    }
}

void train()    /* compute error and modify weights */
{
    register int i;

    error=desired-output;
    for(i=0;i<activated.n;i++)           /* adjust every activated weight */
        weight[activated.gate_ptr[i]] += beta*error;   /* delta rule (note 6);
                                            the exact update expression is
                                            missing from this copy */
}
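===================================================================

Several pieces of the listing above (the body of the training loop in
main(), most of form_interconnects(), and the tail of train()) are missing
from this copy.  The sketch below is not a recovery of that code.  It is a
minimal, self-contained CMAC in the spirit of the notes, assuming the usual
"offset tilings" realization of the interconnect (each of "width" tilings
contributes one active cell, i.e. one AND gate, per input vector) and using
the three-bit parity task from note 7 as the driver.  All of the names here
(cell_index, cmac_output, cmac_train, the upper-case macros) are
illustrative and do not come from the original posting.

#include <stdio.h>
#include <math.h>

#define DIM    3                      /* input dimensions                   */
#define QUANT  2                      /* quantization levels per dimension  */
#define WIDTH  2                      /* overlap = number of tilings        */
#define BETA   0.4                    /* delta-rule learning rate           */
#define K1     (QUANT + WIDTH - 1)    /* cells per dimension in one tiling  */
#define CELLS  (K1 * K1 * K1)         /* cells per tiling (DIM = 3)         */

static double w[WIDTH][CELLS];        /* one trainable weight per cell      */

/* Cell ("AND gate") that input vector x activates in tiling t.  Tiling t is
   shifted by t quantization steps, so neighbouring inputs share cells; that
   sharing is the overlap the "width" parameter controls.                   */
static int cell_index(const int x[DIM], int t)
{
    int d, idx = 0;
    for (d = 0; d < DIM; d++)
        idx = idx * K1 + (x[d] + t) / WIDTH;
    return idx;
}

/* Note 5: the output is the sum of the weights of the activated cells.     */
static double cmac_output(const int x[DIM])
{
    int t;
    double y = 0.0;
    for (t = 0; t < WIDTH; t++)
        y += w[t][cell_index(x, t)];
    return y;
}

/* Note 6: delta rule, spreading the correction over the activated cells.   */
static void cmac_train(const int x[DIM], double target)
{
    int t;
    double err = target - cmac_output(x);
    for (t = 0; t < WIDTH; t++)
        w[t][cell_index(x, t)] += BETA * err / WIDTH;
}

/* Three-bit parity demo in the spirit of note 7.                           */
int main(void)
{
    int pass, i, d, x[DIM];
    double err, max_err = 1.0;

    for (pass = 1; max_err >= 0.001 && pass <= 1000; pass++) {
        max_err = 0.0;
        for (i = 0; i < 8; i++) {             /* every 3-bit input vector   */
            int parity = 0;
            for (d = 0; d < DIM; d++) {
                x[d] = (i >> d) & 1;          /* quantized input: 0 or 1    */
                parity ^= x[d];
            }
            err = fabs((double)parity - cmac_output(x));
            if (err > max_err) max_err = err;
            cmac_train(x, (double)parity);
        }
        printf("pass %3d  max error %g\n", pass, max_err);
    }
    return 0;
}

Dividing the correction by WIDTH in cmac_train() just keeps the overall step
size independent of how many cells are active per input; it is not essential
to the method.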
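Note 1 takes it for granted that each input has already been reduced to one
of "quant" discrete levels; nothing in the listing shows that step.  A
helper along the following lines is one way to do it.  The name quantize(),
the [lo, hi] range arguments, and the clamping policy are illustrative
assumptions, not part of the original code (the parameter is called "levels"
rather than "quant" so that it does not collide with the listing's macro of
that name).

/* Map a continuous reading in [lo, hi] onto an integer level 0..levels-1,
   clamping anything outside the range.  The result can be stored directly
   into the listing's input[] array, with levels set to its quant value.   */
int quantize(double v, double lo, double hi, int levels)
{
    int level = (int)((v - lo) * (double)levels / (hi - lo));

    if (level < 0)          level = 0;
    if (level > levels - 1) level = levels - 1;
    return level;
}

For example, input[0] = quantize(reading, 0.0, 10.0, quant); would put a
sensor reading assumed to lie on a 0-10 scale into the form that
compute_output() expects.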