All Questions
32 questions
1 vote · 1 answer · 2k views
Difference between Logistic Regression and Decision Trees
I was studying decision trees and understood that they are generally used in classification problems.
But logistic regression is also used only in classification problems.
So I searched everywhere on ...
-1 votes · 1 answer · 48 views
Similar validation accuracy for sparse and non-sparse datasets in the case of decision trees
The blog https://www.kdnuggets.com/2021/01/sparse-features-machine-learning-models.html mentions that the decision tree overfits the data in the case when we have sparse features.
To understand the ...
0 votes · 1 answer · 5k views
Calculating Entropy in a decision tree
I am finding it difficult to calculate entropy and information gain for ID3 in the scenario where there are multiple possible classes and the parent class has a lower entropy than the child.
Let me use ...
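For reference, the standard ID3 definitions can be computed directly; a minimal Python sketch (the labels below are invented purely for illustration):

```python
# Sketch: Shannon entropy and information gain per the ID3 definitions.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(parent_labels, child_label_groups):
    """Parent entropy minus the size-weighted entropy of the children."""
    n = len(parent_labels)
    weighted = sum(len(g) / n * entropy(g) for g in child_label_groups)
    return entropy(parent_labels) - weighted

# A 4/4 parent split perfectly into pure children: entropy 1.0, gain 1.0.
parent = ["yes"] * 4 + ["no"] * 4
print(entropy(parent))                                       # 1.0
print(information_gain(parent, [["yes"] * 4, ["no"] * 4]))   # 1.0
```

Note that a single child may well have higher entropy than the parent, but the *size-weighted* child entropy never exceeds the parent's (entropy is concave), so the gain over a true partition is non-negative.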
1 vote · 2 answers · 1k views
Reusing a feature to split a regression decision tree's nodes
I was left with a small question by the end of a video I watched about the regression tree algorithm: when some feature of the dataset has the threshold with the lowest value for the sum of squared ...
-1 votes · 1 answer · 448 views
How to calculate the information gain and entropy of a dataset with ten features?
I have a dataset of 10K, and I created the following ten features:
Distance - (0 or 1)
IsPronoun - (True or False)
String Match - (True or False)
Demonstrative NP - (True if i and j are demonstrative ...
0 votes · 0 answers · 202 views
Different decision tree pruning methods
I am trying to learn different pruning methods for decision trees. Below is the list of methods I found:
1. Reduced Error Pruning
2. Cost Complexity Pruning
3. Minimum Error Pruning
4. Pessimistic ...
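Of these, cost-complexity pruning is the variant exposed by scikit-learn through the `ccp_alpha` parameter; a hedged sketch on the Iris data (the dataset and the alpha value 0.02 are arbitrary illustrations, not recommendations):

```python
# Sketch: minimal cost-complexity pruning via scikit-learn's ccp_alpha.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

full = DecisionTreeClassifier(random_state=0).fit(X, y)
pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=0.02).fit(X, y)

# Pruning collapses subtrees whose impurity reduction is not worth their
# complexity, so the pruned tree ends up with fewer nodes.
print(full.tree_.node_count, pruned.tree_.node_count)
```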
2 votes · 1 answer · 260 views
Possible Algorithms for Random Forest
I am doing research on random forests and was searching for random forest algorithms.
I have already looked up algorithms for decision trees (such as ID3, C4.5, and CART).
But what are different ...
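For context, Breiman's original random-forest procedure is essentially bagging over fully grown trees plus a random feature subset at each split; a minimal sketch (ensemble size and dataset are arbitrary choices):

```python
# Sketch of the random-forest procedure: bootstrap resampling plus
# per-split feature subsampling over plain decision trees.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)

trees = []
for _ in range(25):
    idx = rng.integers(0, len(X), size=len(X))    # bootstrap sample
    tree = DecisionTreeClassifier(
        max_features="sqrt",                      # random feature subset per split
        random_state=int(rng.integers(2**31)),
    )
    trees.append(tree.fit(X[idx], y[idx]))

# Classify by majority vote over the ensemble.
votes = np.stack([t.predict(X) for t in trees])   # shape (n_trees, n_samples)
pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
print((pred == y).mean())
```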
2 votes · 2 answers · 228 views
“Help” a decision tree by tying 2 features together
Assuming I have 2 (or more) features in my dataset that are definitely linked (for example, feature B indicates how relevant feature A is), is there a way I could design a decision tree that ...
0 votes · 0 answers · 775 views
Join decision trees models into one decision tree
I have five decision trees for five datasets, and I want to combine them all into one decision tree.
I believe it is similar to the bagging technique. It would be great if experts could post a few links ...
0 votes · 1 answer · 86 views
How to define the formula in RWeka's InfoGainAttributeEval
Can anybody tell me how to define the formula in RWeka?
A<- InfoGainAttributeEval(formula ~ . , data = TrainDataLSVT,na.action=NULL )
There are 310 features in TrainDataLSVT.
1 vote · 1 answer · 77 views
Are the rules generated by decision tree learner algorithm correlated?
I have been working on a decision tree learner algorithm to detect fraudulent bank transactions.
So far, I have generated a rule set for the decision tree based on my dataset.
I have also generated ...
0 votes · 1 answer · 912 views
Negative value of Information Gain
I'm implementing C4.5, and in my calculations I'm getting negative values for information gain for some examples. I read "Why am I getting a negative information gain", but my issue seems to be ...
1 vote · 2 answers · 1k views
Decision tree algorithm for mixed numeric and nominal data
My dataset contains a number of numeric and categorical attributes.
Example: numericAttr1, numericAttr2, categoricalAttr1, numericalAttr3... where categoricalAttr values: categoricalAttrValue1, ...
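One common workaround, sketched below under the assumption that scikit-learn is acceptable (its trees take only numeric input), is to one-hot encode the categorical columns inside a pipeline; the column names and values below mirror the question's placeholders, and the data is invented for illustration:

```python
# Sketch: fit a CART-style tree on mixed numeric/categorical data by
# one-hot encoding the categorical columns in a ColumnTransformer.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder
from sklearn.tree import DecisionTreeClassifier

df = pd.DataFrame({
    "numericAttr1": [1.0, 2.5, 3.1, 0.4],
    "categoricalAttr1": ["a", "b", "a", "c"],
    "label": [0, 1, 0, 1],
})

pre = ColumnTransformer(
    [("cat", OneHotEncoder(handle_unknown="ignore"), ["categoricalAttr1"])],
    remainder="passthrough",  # numeric columns pass through unchanged
)
model = make_pipeline(pre, DecisionTreeClassifier(random_state=0))
model.fit(df[["numericAttr1", "categoricalAttr1"]], df["label"])
print(model.predict(df[["numericAttr1", "categoricalAttr1"]]))
```

Algorithms such as C4.5 handle nominal attributes natively, so the encoding step is only needed for implementations restricted to numeric features.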
3 votes · 0 answers · 2k views
Implementing Pseudocode For ID3 Algorithm
I'm trying to implement the pseudocode for the ID3 algorithm given below.
function ID3(I, O, T) {
/* I is the set of input attributes
* O is the output attribute
* T is a set of ...
2 votes · 1 answer · 405 views
Machine learning method which is able to integrate prior knowledge in a decision tree
Does any of you know a machine learning method, or combination of methods, that makes it possible to integrate prior knowledge into the building process of a decision tree?
By "prior knowledge" I mean ...