
DBN_finetune: layer_input used before set #18

Open
DGDanforth opened this issue Feb 7, 2016 · 0 comments
I converted the C code for DBN to Component Pascal and ran its analyzer over the result. It reported that, in DBN_finetune, the variable "layer_input" is used before any value is set for it. Going back to the C code, I see that is indeed true: on some paths the array is accessed before it is ever allocated.

That has got to be a bug.

Here is the C code:

void DBN_finetune(DBN* this, int *input, int *label, double lr, int epochs) {
  int i, j, m, n, epoch;

  int *layer_input;
  // int prev_layer_input_size;
  int *prev_layer_input;

  int *train_X = (int *)malloc(sizeof(int) * this->n_ins);
  int *train_Y = (int *)malloc(sizeof(int) * this->n_outs);

  for(epoch=0; epoch<epochs; epoch++) {
    for(n=0; n<this->N; n++) { // input x1...xN
      // initial input
      for(m=0; m<this->n_ins; m++)  train_X[m] = input[n * this->n_ins + m];
      for(m=0; m<this->n_outs; m++) train_Y[m] = label[n * this->n_outs + m];

      // layer input
      for(i=0; i<this->n_layers; i++) {
        if(i == 0) {
          prev_layer_input = (int *)malloc(sizeof(int) * this->n_ins);
          for(j=0; j<this->n_ins; j++) prev_layer_input[j] = train_X[j];
        } else {
          prev_layer_input = (int *)malloc(sizeof(int) * this->hidden_layer_sizes[i-1]);
          for(j=0; j<this->hidden_layer_sizes[i-1]; j++) prev_layer_input[j] = layer_input[j];
          free(layer_input);
        }

        layer_input = (int *)malloc(sizeof(int) * this->hidden_layer_sizes[i]);
        HiddenLayer_sample_h_given_v(&(this->sigmoid_layers[i]), \
                                     prev_layer_input, layer_input);
        free(prev_layer_input);
      }

      LogisticRegression_train(&(this->log_layer), layer_input, train_Y, lr);
    }
    // lr *= 0.95;
  }

  free(layer_input);
  free(train_X);
  free(train_Y);
}
