
I use R for data analysis. I have a dataset, and when I apply different classification algorithms, such as random forest and SVM, I get different accuracies. So I want to integrate all the algorithms into one framework, say AdaBoost.

We know that the AdaBoost framework combines multiple "weak" classifiers into one strong classifier. So, can I choose the "weak" classifiers myself? Here is my current idea: in this framework, I use SVM first, then give more weight to the data points that were classified incorrectly, then I use random forest, and so on. In the end, all the classifiers in the framework work together.
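To make the idea concrete, here is a minimal sketch of an AdaBoost-style loop that cycles through heterogeneous weak learners. The base learners below are stand-ins (weighted logistic regression and a one-variable decision stump) so the example runs with base R only; in practice you could swap in `e1071::svm`, `randomForest`, etc., as long as each respects the example weights. The data are simulated for illustration.

```r
set.seed(1)
n <- 200
x1 <- rnorm(n); x2 <- rnorm(n)
y  <- ifelse(x1 + x2 + rnorm(n, sd = 0.5) > 0, 1, -1)  # labels in {-1, +1}
dat <- data.frame(x1, x2, y)

# Weak learner 1: weighted logistic regression, thresholded at 0.5
fit_logit <- function(d, w) {
  m <- glm(I(y == 1) ~ x1 + x2, data = d, weights = w,
           family = quasibinomial())
  function(newd) ifelse(predict(m, newd, type = "response") > 0.5, 1, -1)
}

# Weak learner 2: best single split on x1, chosen by weighted error
fit_stump <- function(d, w) {
  best <- list(err = Inf)
  for (s in quantile(d$x1, probs = seq(0.1, 0.9, 0.1))) {
    for (sgn in c(1, -1)) {
      pred <- ifelse(d$x1 > s, sgn, -sgn)
      err  <- sum(w * (pred != d$y))
      if (err < best$err) best <- list(err = err, s = s, sgn = sgn)
    }
  }
  function(newd) ifelse(newd$x1 > best$s, best$sgn, -best$sgn)
}

learners <- list(fit_logit, fit_stump)   # alternate the two model types
M     <- 6
w     <- rep(1 / n, n)                   # initial example weights
alpha <- numeric(M)
preds <- vector("list", M)

for (m in seq_len(M)) {
  fit  <- learners[[(m - 1) %% length(learners) + 1]](dat, w)
  yhat <- fit(dat)
  err  <- sum(w * (yhat != y)) / sum(w)
  err  <- min(max(err, 1e-10), 1 - 1e-10)   # guard against 0/1 error
  alpha[m] <- 0.5 * log((1 - err) / err)    # classifier weight
  w <- w * exp(-alpha[m] * y * yhat)        # upweight the mistakes
  w <- w / sum(w)
  preds[[m]] <- yhat
}

# Final prediction: alpha-weighted vote of all rounds
score <- Reduce(`+`, Map(function(a, p) a * p, alpha, preds))
final <- ifelse(score > 0, 1, -1)
cat("training accuracy:", mean(final == y), "\n")
```

The key point is that the boosting loop only needs each base learner to accept example weights and return predictions; the learners themselves can be of different types.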

This is just my current thinking on the issue. If another method works, such as voting, please let me know too.
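For the voting alternative, here is roughly what I imagine: a plain (unweighted) majority vote over the predictions of already-fitted classifiers. The vectors `p1`, `p2`, `p3` are made-up stand-ins for the label vectors an SVM, a random forest, and a boosted model might return on the same test set.

```r
# Labels coded as {-1, +1}, so a simple sum gives the vote tally
p1 <- c(1,  1, -1,  1)
p2 <- c(1, -1, -1, -1)
p3 <- c(-1, 1, -1,  1)

votes    <- p1 + p2 + p3
majority <- ifelse(votes > 0, 1, -1)
majority
# [1]  1  1 -1  1
```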

Any help is appreciated.

  • Adaboost is a boosting algorithm. Where is the relation to bagging? Commented Oct 5, 2017 at 13:37
  • Thanks for your comment. I changed my question a little bit. Commented Oct 5, 2017 at 22:21
  • It's not quite clear what you mean by "integrating all the algorithms into one framework". Would you consider a voting classifier as such a framework? Commented Oct 5, 2017 at 22:27
  • For example, I use SVM first. Based on the result, I use random forest to improve the accuracy. If a voting framework works, I am happy to use it too. Commented Oct 5, 2017 at 22:34

1 Answer


What you're looking for is called an ensemble model: a combination of several models that improves on the results of any single one. This is a very common technique among winners of Kaggle competitions. Since you're using R, and caret is a popular way to do ML in R, there is a package built on caret for exactly this purpose:

https://cran.r-project.org/web/packages/caretEnsemble/vignettes/caretEnsemble-intro.html
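A minimal sketch of that workflow, assuming the caret and caretEnsemble packages (plus the underlying model packages, here kernlab and randomForest) are installed; the dataset and method names are illustrative:

```r
library(caret)
library(caretEnsemble)

# caretEnsemble targets two-class problems, so keep two iris species
data(iris)
two <- droplevels(subset(iris, Species != "setosa"))

ctrl <- trainControl(method = "cv", number = 5,
                     savePredictions = "final", classProbs = TRUE)

# Train several heterogeneous models on the same resamples
models <- caretList(Species ~ ., data = two, trControl = ctrl,
                    methodList = c("svmRadial", "rf"))

# Linear blend of the base models
ens <- caretEnsemble(models)
summary(ens)

# Or stack them with a meta-learner of your choice
stack <- caretStack(models, method = "glm")
```

Note that `caretList` fits the models on identical cross-validation folds, which is what makes the out-of-fold predictions usable for blending or stacking.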

  • The OP is about sequential application of methods, not (parallel) ensembling. So e.g. one boosting round of linear regression, then one round of random forest, then gbm etc. Commented Oct 6, 2017 at 17:26
