Random Naive Bayes and Random Forest

Random Naive Bayes (Random NB) generalizes the Random Forest idea to Naive Bayes: it is a bagged classifier combining a forest of B Naive Bayes classifiers. The b-th Naive Bayes classifier is estimated on a bootstrap sample Sb using m randomly selected features. To classify an observation, the input vector is put down each of the B Naive Bayes classifiers in the forest, and each one generates posterior class probabilities. Unlike Random Forest, the ensemble's predicted class is determined by adjusted majority voting rather than simple majority voting, because each Naive Bayes classifier delivers continuous posterior probabilities that can be averaged rather than a single vote. As in Random Forest, the importance of each feature is estimated on the out-of-bag (oob) data.
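The procedure above can be sketched in a few lines of numpy. This is a minimal illustration, not a reference implementation: the synthetic data, the Gaussian form of Naive Bayes, and the values of B and m are all assumptions chosen for the example, and the adjusted majority vote is realized here by averaging the posterior probabilities across the forest.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data (illustrative): class 0 centered at 0, class 1 at 2.
n_per, p = 100, 6
X = np.vstack([rng.normal(0.0, 1.0, (n_per, p)),
               rng.normal(2.0, 1.0, (n_per, p))])
y = np.repeat([0, 1], n_per)
n = 2 * n_per

def fit_gaussian_nb(X, y):
    """Per-class feature means, variances, and priors for Gaussian Naive Bayes."""
    classes = np.unique(y)
    mu = np.array([X[y == c].mean(axis=0) for c in classes])
    var = np.array([X[y == c].var(axis=0) + 1e-9 for c in classes])  # variance smoothing
    prior = np.array([(y == c).mean() for c in classes])
    return mu, var, prior

def predict_proba(model, X):
    """Posterior class probabilities under the naive independence assumption."""
    mu, var, prior = model
    # Sum per-feature Gaussian log-likelihoods, then add the log prior.
    log_lik = -0.5 * (np.log(2 * np.pi * var)[None, :, :]
                      + (X[:, None, :] - mu[None, :, :]) ** 2
                      / var[None, :, :]).sum(axis=2)
    log_post = log_lik + np.log(prior)[None, :]
    log_post -= log_post.max(axis=1, keepdims=True)  # stabilize before exp
    post = np.exp(log_post)
    return post / post.sum(axis=1, keepdims=True)

B, m = 25, 3  # forest size and features per classifier (illustrative choices)
forest = []
for _ in range(B):
    boot = rng.integers(0, n, n)                  # bootstrap sample S_b
    feats = rng.choice(p, size=m, replace=False)  # m randomly selected features
    forest.append((fit_gaussian_nb(X[boot][:, feats], y[boot]), feats))

# Adjusted majority voting: average the posterior probabilities over the forest,
# then predict the class with the largest averaged posterior.
avg_proba = np.mean([predict_proba(mdl, X[:, feats]) for mdl, feats in forest],
                    axis=0)
pred = avg_proba.argmax(axis=1)
accuracy = (pred == y).mean()
```

Averaging posteriors rather than counting hard votes is what the text calls adjusted majority voting: a classifier that is barely past 0.5 contributes less than one that is confidently near 1.0.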
