
How to generate Path with DateTimeFormat for pattern yyyy/mm/dd/HH



By : xianyu
Date : November 21 2020, 11:01 PM
Hope this fixes your issue. You have used LocalDate, which is a date-only class: it deliberately contains no time-of-day information (unlike java.sql.Date, which holds both date and time). Joda-Time therefore cannot render the "HH" as an hour, because the value has no hour to print. Note too that in the pattern from the title, yyyy/mm/dd/HH, lowercase mm means minute-of-hour; month is uppercase MM.
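To see the failure mode concretely, here is a minimal sketch using java.time (chosen because it ships with the JDK; Joda-Time's LocalDate has the same date-only limitation): formatting a date-only value with an "HH" pattern fails, while a date-time value works.

```scala
import java.time.{LocalDate, LocalDateTime}
import java.time.format.DateTimeFormatter
import scala.util.Try

val pattern = DateTimeFormatter.ofPattern("yyyy/MM/dd/HH")

// A date-only value has no hour-of-day field, so "HH" cannot be rendered.
val dateOnly = Try(LocalDate.of(2016, 9, 25).format(pattern))
println(dateOnly.isFailure) // true: UnsupportedTemporalTypeException

// A date-time value carries the hour, so the same pattern prints fine.
val withTime = LocalDateTime.of(2016, 9, 25, 4, 0).format(pattern)
println(withTime) // 2016/09/25/04
```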
Try instead:
code :
import org.joda.time.{DateTime, DateTimeZone, Days}
import org.joda.time.format.DateTimeFormat

val startDate = "2016-09-25T04:00:00Z"
val endDate = "2016-10-23T04:00:00Z"
val s3Bucket = "s3://test_bucket/"

// Parse the ISO timestamps as UTC DateTimes (date *and* time, unlike LocalDate).
def getUtilDate(timestamp: String): DateTime =
  new DateTime(timestamp, DateTimeZone.UTC)

val start = getUtilDate(startDate)
val end = getUtilDate(endDate)

val days: Int = Days.daysBetween(start, end).getDays

// One S3 prefix per day, rendered with month as MM and hour as HH.
val files: Seq[String] = (0 to days)
  .map(start.plusDays)
  .map(d => s"$s3Bucket${DateTimeFormat.forPattern("yyyy/MM/dd/HH").print(d)}/*")

println(files)
import scala.annotation.tailrec
import scala.collection.mutable
import org.joda.time.{DateTime, DateTimeZone}
import org.joda.time.format.DateTimeFormat

val startDate = "2016-09-25T04:00:00Z"
val endDate = "2016-10-23T04:00:00Z"

val s3Bucket = "s3://test_bucket/"

def getUtilDate(timestamp: String): DateTime =
  new DateTime(timestamp, DateTimeZone.UTC)

val start = getUtilDate(startDate)
val end = getUtilDate(endDate)

val fmt = DateTimeFormat.forPattern("yyyy/MM/dd/HH")
def bucketName(date: DateTime): String = s"$s3Bucket${fmt.print(date)}"

{
  // Imperative style: step one hour at a time until the end is reached.
  var t = start
  val files = mutable.Buffer[String]()
  do {
    files += bucketName(t)
    t = t.plusHours(1)
  } while (t.compareTo(end) < 0)

  println(files)
}

{
  // Functional style: tail-recursive accumulation over the same hourly steps.
  // Using isBefore rather than matching on `end` exactly, so the loop also
  // terminates when the range is not a whole number of hours.
  @tailrec
  def loop(t: DateTime, acc: Seq[String]): Seq[String] =
    if (t.isBefore(end)) loop(t.plusHours(1), acc :+ bucketName(t))
    else acc

  val files = loop(start, Vector())

  println(files)
}
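For completeness, the same hourly enumeration can be written against the JDK's java.time instead of Joda-Time. A sketch using the bucket from above (with a shorter range so the output stays readable):

```scala
import java.time.ZonedDateTime
import java.time.format.DateTimeFormatter

val fmt = DateTimeFormatter.ofPattern("yyyy/MM/dd/HH")
val start = ZonedDateTime.parse("2016-09-25T04:00:00Z")
val end   = ZonedDateTime.parse("2016-09-25T07:00:00Z")

// One S3 prefix per hour in [start, end).
val files: Seq[String] = Iterator
  .iterate(start)(_.plusHours(1))
  .takeWhile(_.isBefore(end))
  .map(t => s"s3://test_bucket/${fmt.format(t)}/*")
  .toSeq

println(files)
```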


The best way to generate path pattern for materialized path tree structures



By : Prasad Prasad
Date : March 29 2020, 07:55 AM
Hope this helps you fix your problem. You are correct: zero-padding each node ID would allow you to sort the entire tree quite simply. However, the padding width has to match the maximum number of digits the ID field can hold, as you point out in your last example. E.g., if you are using an int unsigned field for the ID, the highest value is 4,294,967,295, which is ten digits, so the record set from your last example might look like:
code :
-- 1) Decimal padding (10 digits, enough for 4,294,967,295):
uid   | name | tree_id
9999  | Tar  | 0000000001.0000009999
10000 | Tor  | 0000000001.0000010000

-- 2) Hexadecimal padding (8 digits cover the same range):
uid   | name | tree_id
9999  | Tar  | 00000001.0000270F
10000 | Tor  | 00000001.00002710

-- 3) The hexadecimal keys with an additional name-based sort column:
uid   | name | tree_id           | name_sort
9999  | Tar  | 00000001.0000270F | Ali.Tar
10000 | Tor  | 00000001.00002710 | Ali.Tor
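The padded keys above come from simple fixed-width formatting. A sketch with hypothetical helper names (`decimalTreeId` and `hexTreeId` are mine, not from the answer):

```scala
// Zero-pad each ancestor ID to the field's maximum width, then join with dots.
def decimalTreeId(path: Seq[Long]): String =
  path.map(id => f"$id%010d").mkString(".") // 10 digits covers 4,294,967,295

def hexTreeId(path: Seq[Long]): String =
  path.map(id => f"$id%08X").mkString(".")  // 8 hex digits cover the same range

println(decimalTreeId(Seq(1L, 9999L))) // 0000000001.0000009999
println(hexTreeId(Seq(1L, 9999L)))     // 00000001.0000270F
```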
Failed to convert string in java.util.Date with @DateTimeFormat(pattern="dd/MM/yyyy")



By : Gurbax
Date : March 29 2020, 07:55 AM
To get around this issue: the question asks, "I have a simple POJO with a Date field. I want to bind the object with values from a form." Your date format does not match the input; the annotation's pattern must use the same separators as the submitted string:
code :
@DateTimeFormat(pattern = "dd/MM/yyyy")  // expects e.g. 21/11/2020
@DateTimeFormat(pattern = "dd-MM-yyyy")  // expects e.g. 21-11-2020
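The point generalizes beyond the Spring annotation: whatever pattern is declared has to match the incoming string exactly, separators included. A small java.time sketch of the same mismatch:

```scala
import java.time.LocalDate
import java.time.format.DateTimeFormatter
import scala.util.Try

val slashes = DateTimeFormatter.ofPattern("dd/MM/yyyy")

// Matching separators parse...
val ok = Try(LocalDate.parse("21/11/2020", slashes)).isSuccess
println(ok) // true
// ...mismatched separators do not.
val bad = Try(LocalDate.parse("21-11-2020", slashes)).isSuccess
println(bad) // false
```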
Pattern Matches for Date Format: dd.MM.yyyy, MM.yyyy and yyyy



By : Shoichi
Date : March 29 2020, 07:55 AM
This might help you. The question asks: "I need to check a date in a given string. The string I get isn't regular, and I don't want to use DateFormat or similar, because I'm trying to avoid handling multiple exceptions. I need a regex for dd.MM.yyyy, MM.yyyy and yyyy." If you really want to use a regex, here is a basic working example:
code :
String regex = "([0-9]{2}\\.){0,2}([0-9]{4})";
assert "03.2017".matches(regex);
assert "31.03.2017".matches(regex);
assert "2017".matches(regex);
assert !"23-2017".matches(regex);

// java.time alternative: optional sections in the pattern.
DateTimeFormatter formatter = DateTimeFormatter.ofPattern("[[dd.]MM.]yyyy");
// correct dates
assert formatter.parse("31.12.2017") != null;
assert formatter.parse("12.2017") != null;
assert formatter.parse("2017") != null;
// wrong date: note that parse() never returns null; on failure it throws
// DateTimeParseException, so wrap it in try/catch rather than checking for null
// formatter.parse("31.2017");

// Apache Commons Lang alternative: DateUtils tries each pattern in turn.
String[] acceptedFormats = {"dd.MM.yyyy", "MM.yyyy", "yyyy"};

// correct dates
assert DateUtils.parseDate("07.12.2017", acceptedFormats) != null;
assert DateUtils.parseDate("07.2017", acceptedFormats) != null;
assert DateUtils.parseDate("2017", acceptedFormats) != null;
// wrong dates: parseDate throws ParseException instead of returning null,
// and it is lenient by default; use parseDateStrictly to reject values
// like "01.13.2012"
// DateUtils.parseDate("123.2012", acceptedFormats);
// DateUtils.parseDate("01.13.2012", acceptedFormats);
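If exceptions are the real concern, they can also be contained in one place instead of being avoided with a regex. A sketch that tries several java.time patterns in order (the pattern list is mine, mirroring the formats above):

```scala
import java.time.format.DateTimeFormatter
import java.time.temporal.TemporalAccessor
import scala.util.{Success, Try}

val patterns = Seq("dd.MM.yyyy", "MM.yyyy", "yyyy").map(DateTimeFormatter.ofPattern)

// Return the first successful parse, or None if every pattern fails.
def parseAny(s: String): Option[TemporalAccessor] =
  patterns.iterator.map(p => Try(p.parse(s))).collectFirst { case Success(t) => t }

println(parseAny("31.03.2017").isDefined) // true
println(parseAny("03.2017").isDefined)    // true
println(parseAny("2017").isDefined)       // true
println(parseAny("23-2017").isDefined)    // false: wrong separator
```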
regular expression for datetimeformat (yyyy-mm-ddThh:mm:ss) ISO8601 format



By : Hamza sadiqi
Date : March 29 2020, 07:55 AM
Wish this helps you. The regex in Adam Yost's answer is close, but it is missing a closing bracket before the T; I don't have enough rep to comment, so here is the corrected version:
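The corrected regex itself is absent above; as an illustrative stand-in (a pattern of my own, not Adam Yost's), a basic matcher for yyyy-MM-ddTHH:mm:ss:

```scala
// Validates rough calendar ranges (months 01-12, days 01-31, hours 00-23).
val iso8601 = """\d{4}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])T([01]\d|2[0-3]):[0-5]\d:[0-5]\d"""

println("2016-09-25T04:00:00".matches(iso8601)) // true
println("2016-13-25T04:00:00".matches(iso8601)) // false: month 13
```

Like any regex date check, this accepts some impossible dates (e.g. February 31), so it is a format filter, not full validation.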
@DateTimeFormat(pattern="yyyy-MM-dd") with Spring Mvc Rest Service gives "error 400 request syntactically



By : esamota
Date : March 29 2020, 07:55 AM
This fixes the issue. According to the Spring forum, you need a conversion service initialized in your configuration for the Formatter to be applied automatically. Something like:
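The original snippet is missing above; as a hedged sketch (Scala syntax, Spring 5 style, and exact class names vary by Spring version), registering the date formatters through a `WebMvcConfigurer` looks roughly like:

```scala
import org.springframework.context.annotation.Configuration
import org.springframework.format.FormatterRegistry
import org.springframework.format.datetime.DateFormatterRegistrar
import org.springframework.web.servlet.config.annotation.{EnableWebMvc, WebMvcConfigurer}

@Configuration
@EnableWebMvc
class WebConfig extends WebMvcConfigurer {
  // Registers java.util.Date formatters so @DateTimeFormat is honoured
  // automatically during request binding.
  override def addFormatters(registry: FormatterRegistry): Unit = {
    val registrar = new DateFormatterRegistrar()
    registrar.registerFormatters(registry)
  }
}
```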