Multinomial Naive Bayes
import collections as cl

import numpy as np


class MultinomialNB(object):
    def __init__(self):
        self.X = None
        self.y = None
        self.list_labels = None

    def __loading(self):
        # Per-class parameters: prior P(c) and add-one (Laplace) smoothed
        # per-feature likelihoods P(w | c).
        self.list_labels = {}
        counts = cl.Counter(self.y)
        for label, count in counts.items():
            temp = self.X[self.y == label].sum(axis=0) + 1  # smoothed term counts
            self.list_labels[label] = [
                count / len(self.y),   # prior P(c)
                temp / temp.sum()      # conditional probabilities P(w | c)
            ]

    def fit(self, X, y):
        self.X = X
        self.y = y
        self.__loading()

    def predict(self, X):
        # Yields the normalized posterior P(c | x) for each document,
        # without mutating the fitted parameters.
        for x in X:
            scores = {
                label: prior * np.prod(likelihood ** x)
                for label, (prior, likelihood) in self.list_labels.items()
            }
            total = sum(scores.values())
            yield {label: score / total for label, score in scores.items()}


# Toy term-count matrix: four documents over a nine-word vocabulary.
d1 = [2, 1, 1, 0, 0, 0, 0, 0, 0]
d2 = [1, 1, 0, 1, 1, 0, 0, 0, 0]
d3 = [0, 1, 0, 0, 1, 1, 0, 0, 0]
d4 = [0, 1, 0, 0, 0, 0, 1, 1, 1]
train_data = np.array([d1, d2, d3, d4])
label = np.array(['B', 'B', 'B', 'N'])

mnb = MultinomialNB()
mnb.fit(train_data, label)
print(list(mnb.predict([np.array([0, 1, 0, 0, 0, 0, 0, 1, 1])])))
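As a quick sanity check, the same term counts can be fed to scikit-learn's MultinomialNB, which also uses add-one smoothing (alpha=1.0) and empirical class priors by default, so it should produce the same posteriors as the class above. The sketch below is not part of the original snippet and assumes scikit-learn is installed; the alias is only there to avoid clashing with the hand-rolled class name.

# Hypothetical cross-check (not in the original snippet); assumes scikit-learn
# is available. alpha=1.0 matches the add-one smoothing used above.
import numpy as np
from sklearn.naive_bayes import MultinomialNB as SkMultinomialNB

train_data = np.array([
    [2, 1, 1, 0, 0, 0, 0, 0, 0],
    [1, 1, 0, 1, 1, 0, 0, 0, 0],
    [0, 1, 0, 0, 1, 1, 0, 0, 0],
    [0, 1, 0, 0, 0, 0, 1, 1, 1],
])
label = np.array(['B', 'B', 'B', 'N'])

clf = SkMultinomialNB(alpha=1.0)
clf.fit(train_data, label)

# predict_proba returns one row per test document, columns ordered as clf.classes_.
test = np.array([[0, 1, 0, 0, 0, 0, 0, 1, 1]])
print(dict(zip(clf.classes_, clf.predict_proba(test)[0])))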