The lex() function currently requires an input length of exactly 2400 observations/rows. I had tested this before and it worked well both with fewer than 2400 rows and with exactly 2400, which was fair enough for me. However, when I ran lex() yesterday with fewer than 2400 observations, it returned an error, meaning the length must now be exactly 2400 rows, not less!
Q: Is there a way to make it flexible, accepting any multiple of 60?
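One possible shape for that flexibility, sketched as a standalone helper (check_length is a hypothetical name, not part of the actual lex() code): accept any length that divides evenly by 60 and stop with an informative message otherwise.

```r
# Hypothetical helper: allow any input length that is a multiple of 60
check_length <- function(answer) {
  n <- length(answer)
  if (n %% 60 != 0) {
    stop("answer length must be a multiple of 60, got ", n)
  }
  n  # return the accepted length
}
```

With this in place, lex() could derive the number of participants as length(answer) / 60 instead of hard-coding 2400.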
For the test data, the following code generates fake data (also available in the Lextale README file, under examples):
answer <- sample(c(0, 1), 2400, replace = TRUE)  # generate 2400 random binary responses
ID <- gl(40, 60)                                 # generate 40 IDs, 60 observations each
test_data <- data.frame(ID, answer)              # combine the two columns into one data frame
In the lex() function code, I need to include an 'if' statement: if users supply yes/no responses instead of binary data (0/1), the function should convert yes to 1 and no to 0. For now, lex() only accepts binary 0/1.
Additionally (not strictly necessary), I'd like to ensure that the ID and answer columns are of equal length, e.g. if there are 60 observations under ID there should be 60 observations under answer; if not, a warning message should tell the user something along the lines of 'ID and answer lengths must be equal' or 'make sure there are no NAs in your data'.
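Both checks could be bundled into one validation step; a sketch under the assumptions above (validate_input is a hypothetical helper, and the warning texts are the ones suggested in this issue):

```r
# Hypothetical helper: warn on mismatched lengths or missing values
validate_input <- function(ID, answer) {
  if (length(ID) != length(answer)) {
    warning("ID and answer lengths must be equal")
  }
  if (anyNA(ID) || anyNA(answer)) {
    warning("make sure there are no NAs in your data")
  }
  invisible(TRUE)
}
```

Using warning() rather than stop() matches the issue's request for a warning message, though stop() may be safer if lex() cannot produce meaningful output from mismatched columns.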
Hi @BatoolMM