Add dayofyear, weekofyear, month, dayofmonth, minute, second, next_da… #268
base: master
Conversation
```scala
 *
 * apache/spark
 */
def next_day[T](date: AbstractTypedColumn[T, String], dayOfWeek: String): date.ThisType[T, Option[java.sql.Date]] =
```
Right now it doesn't compile. In Spark it returns `java.sql.Date`, and I'm not sure whether I should add a `TypedEncoder` for that or use something else.
I don't see any issue with an encoder for `java.sql.Date`; if this is what's returned in vanilla Spark we can simply follow.
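One route (a sketch, not necessarily what the PR ended up doing): frameless can already derive an encoder for any type that has an `Injection` into an encodable type in scope, so `java.sql.Date` could be bridged through `Int` using Spark's own `DateTimeUtils` conversions, exactly as the #205 test below does:

```scala
import frameless.Injection
import org.apache.spark.sql.catalyst.util.DateTimeUtils

// Sketch: bridge java.sql.Date to Int (days since the epoch) with an Injection,
// so frameless can derive an encoder for columns of java.sql.Date.
implicit val dateAsInt: Injection[java.sql.Date, Int] =
  Injection(DateTimeUtils.fromJavaDate, DateTimeUtils.toJavaDate)
```

The alternative would be a dedicated `TypedEncoder[java.sql.Date]`, which avoids requiring users to import the injection themselves.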
@Avasil Sorry for taking so long to have a look at your PR... The diff looks pretty good at a quick glance; do you need help with anything to get the CI green before the review?
Thanks, and no problem @OlivierBlanvillain, I was a bit busy lately too - I think I'm good. If I have any issues with
@OlivierBlanvillain Hmm, now it fails the test related to #205:

```scala
test("#205: comparing literals encoded using Injection") {
  import org.apache.spark.sql.catalyst.util.DateTimeUtils

  implicit val dateAsInt: Injection[java.sql.Date, Int] =
    Injection(DateTimeUtils.fromJavaDate, DateTimeUtils.toJavaDate)

  val today = new java.sql.Date(System.currentTimeMillis)
  val data = Vector(P(42, today))
  val tds = TypedDataset.create(data)

  tds.filter(tds('d) === today).collect().run()
}

final case class P(i: Int, d: java.sql.Date)
```

It's failing with:
Any tips on how to debug stuff like that? I need to somehow figure out why Spark tries to generate code this way. :D
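One option for inspecting what Spark generates (a sketch, assuming `dataset` is the accessor for the vanilla `Dataset` wrapped by a `TypedDataset`): Spark's debug package can print the whole-stage-codegen Java source for a query plan, which shows exactly where the bad cast or comparison is emitted:

```scala
// Adds debugCodegen() to Dataset/DataFrame.
import org.apache.spark.sql.execution.debug._

// Print the Java source Spark generates for the failing filter.
tds.filter(tds('d) === today).dataset.debugCodegen()
```

Equivalently, `ds.queryExecution.debug.codegen()` prints the same output, and `ds.explain(true)` shows the analyzed and optimized plans leading up to it.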
```
# Conflicts:
#	dataset/src/main/scala/frameless/functions/NonAggregateFunctions.scala
#	dataset/src/test/scala/frameless/functions/NonAggregateFunctionsTests.scala
```
@OlivierBlanvillain @imarios Any ideas how to proceed? :) IIRC many other column functions could use
…y Column functions
Related to #164