888 BONUS LAB 1
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% ELE 888/ EE 8209: LAB 1: Bayesian Decision Theory
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

function [posteriors_x,g_x] = lab1(x,Training_Data,y)

%%
% x             = individual sample to be tested (to identify its probable class label)
% Training_Data = matrix containing the training samples and numeric class labels
% posteriors_x  = posterior probabilities
% g_x           = value of the discriminant function
% y             = unused, kept only for interface compatibility

D = Training_Data;        % D is MxN (M samples, N columns = N-1 features + 1 label)
[M,N] = size(D);

%% %%%% Prior Probabilities %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Hint: use the commands "find" and "length"
% Bonus: use both features jointly with the full covariance matrix

f  = D(:,1:2);            % feature samples (both feature columns)
la = D(:,N);              % class labels

Pr1 = sum(la==1)/length(la);   % P(w1)
Pr2 = sum(la==2)/length(la);   % P(w2)

%% %%%% Class-conditional probabilities %%%%%%%%%%%%%%%%%%%%
disp('Mean & covariance for class 1 & 2');
mu1 = mean(f(la==1,:))    % 1x2 mean vector of the class-conditional density p(x/w1)
mu2 = mean(f(la==2,:))    % 1x2 mean vector of the class-conditional density p(x/w2)

sigma   = cov(f);         % shared 2x2 covariance matrix
lsigmal = det(sigma);     % |Sigma|
isigma  = inv(sigma);
xs = reshape(x(1:2),1,2); % test sample as a 1x2 row vector (changed from y)
d  = size(f,2);           % number of features (2)

% bivariate Gaussian densities evaluated at the test sample
cp1 = 1/((2*pi)^(d/2)*sqrt(lsigmal))*exp(-0.5*(xs-mu1)*isigma*(xs-mu1)'); % p(x/w1)
cp2 = 1/((2*pi)^(d/2)*sqrt(lsigmal))*exp(-0.5*(xs-mu2)*isigma*(xs-mu2)'); % p(x/w2)

%%
disp('Posterior prob. for the test feature');
Px   = cp1*Pr1 + cp2*Pr2; % evidence p(x)
pos1 = cp1*Pr1/Px         % p(w1/x) for the given test feature value
pos2 = cp2*Pr2/Px         % p(w2/x) for the given test feature value

posteriors_x = [pos1, pos2];

plot(pos1,pos2,'o'); hold on   % mark the posterior pair for this test sample

%%
disp('Discriminant function for the test feature');
[C,I] = max(posteriors_x);     % min-error-rate classifier picks the largest posterior
g_x = I                        % g(x) for the min err rate classifier
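A minimal usage sketch, assuming Training_Data holds two feature columns followed by a numeric class label (1 or 2); the values below are made up purely for illustration:

% Hypothetical example call (data values invented for illustration only)
Training_Data = [5.1 3.5 1; 4.9 3.0 1; 5.0 3.4 1; 6.3 3.3 2; 5.8 2.7 2; 6.1 2.9 2];
x = [5.5 3.1];                                    % test sample (feature 1, feature 2)
[posteriors_x, g_x] = lab1(x, Training_Data, []); % third argument is unused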