
package dynamics.etl

object sources

Linear Supertypes: AnyRef, Any

Type Members

  1. type EventRegistration[F[_], A] = (QueueForStream[F, A]) ⇒ (UndefOr[A]) ⇒ Unit

Value Members

  1. def CSVFileSource(file: String, csvParserOptions: ParserOptions = DefaultCSVParserOptions)(implicit ec: ExecutionContext): Stream[IO, Object]

    Create a Stream[IO, js.Object] from a file name and CSV options using csv-parse. Run the stream to start the OS reads. The underlying CSV file is automatically opened and closed.
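    A minimal usage sketch, assuming fs2 and cats-effect are in scope and that "data.csv" is a hypothetical file:

    ```scala
    import scala.concurrent.ExecutionContext.Implicits.global
    import cats.effect.IO
    import dynamics.etl.sources._

    // Count the parsed CSV records; nothing is read until the IO is run.
    val count: IO[Long] =
      CSVFileSource("data.csv")
        .compile
        .fold(0L)((n, _) => n + 1)
    ```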

  2. def CSVFileSourceBuffer_(file: String, csvParserOptions: ParserOptions = DefaultCSVParserOptions)(implicit e: ExecutionContext): IO[Iterable[Object]]
  3. def CSVFileSource_(file: String, csvParserOptions: ParserOptions, f: (Readable) ⇒ Stream[IO, Object])(implicit ec: ExecutionContext): Stream[IO, Object]
  4. val DefaultCSVParserOptions: ParserOptions
  5. def FileLines(file: String)(implicit ec: ExecutionContext): Stream[IO, String]

    Read a file (utf8) line by line.
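    For example, printing the non-empty lines of a hypothetical file (a sketch; assumes cats-effect IO in scope):

    ```scala
    FileLines("notes.txt")
      .filter(_.nonEmpty)
      .evalMap(line => IO(println(line)))
      .compile
      .drain
    ```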

  6. def JSONFileSource[A](file: String, isArray: Boolean = false)(implicit ec: ExecutionContext): Stream[IO, A]

    Stream a file containing JSON objects separated by newlines. Newlines must be escaped inside JSON values. isArray = true indicates that the JSON objects are wrapped in an array and hence have commas separating the objects. When streaming, each emitted element has an index field and a value field holding the actual data. See StreamValue.
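    A sketch of streaming newline-delimited JSON; the index and value fields of StreamValue are taken from the description above, and the file name is hypothetical:

    ```scala
    // Each element carries the record index and the parsed value.
    JSONFileSource[StreamValue[js.Object]]("records.json")
      .evalMap(sv => IO(println(s"#${sv.index}: ${js.JSON.stringify(sv.value)}")))
      .compile
      .drain
    ```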

  7. def MSSQLSource[A](qstr: String, config: |[|[Dynamic, RawOptions], String], qsize: Int = 10000)(implicit ec: ExecutionContext): Stream[IO, A]

    A MSSQL source that executes a query. If you create your own connection pool, use queryToStream once you have created your query request.

    To do

    Make a non-native trait for the connection config options.

    See also

    https://www.npmjs.com/package/mssql#tedious for connection string info.
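    A usage sketch; the query and connection string are hypothetical, and a config object can be passed instead of a string:

    ```scala
    val rows: Stream[IO, js.Object] =
      MSSQLSource[js.Object](
        "select id, name from accounts",
        "mssql://user:password@localhost/mydb"
      )

    // Materialize the first five rows.
    val firstFive: IO[Vector[js.Object]] = rows.take(5).compile.toVector
    ```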

  8. def MSSQLSourceRequest[A](qstr: String, config: |[|[Dynamic, RawOptions], String])(implicit ec: ExecutionContext): IO[Request]
  9. def jsToFs2Stream[F[_], A](events: Seq[(String, EventRegistration[F, A])], source: IEventEmitter, qsize: Int = 1000)(implicit ec: ExecutionContext, F: Effect[F]): Stream[F, A]

    Convert a readable to an fs2 Stream given a specific set of events and their handlers. Each handler can signal the end of the stream by enqueueing None, or signal an error by returning a Left. Handlers return results via qparam.enqueue1(Some(Right(data))).unsafeRunAsync(_ => ()) or something similar. If your callbacks all have different types, you may have to write your own conversion or cast.
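    A sketch of event registration for a hypothetical emitter with "data" and "end" events, following the enqueue pattern described above (the handler shapes and the emitter value are assumptions):

    ```scala
    val onData: EventRegistration[IO, String] =
      q => payload =>
        // Emit one element; Some(Right(...)) means "a value, keep going".
        q.enqueue1(Some(Right(payload.asInstanceOf[String]))).unsafeRunAsync(_ => ())

    val onEnd: EventRegistration[IO, String] =
      q => _ =>
        // None terminates the stream.
        q.enqueue1(None).unsafeRunAsync(_ => ())

    // emitter: IEventEmitter is assumed to exist elsewhere.
    val s: Stream[IO, String] =
      jsToFs2Stream(Seq("data" -> onData, "end" -> onEnd), emitter)
    ```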

  10. def queryToStream[F[_], A](query: Request, qsize: Int = 10000)(implicit ec: ExecutionContext, F: Effect[F]): Stream[F, A]

    Assuming the query has been set to streaming, stream the results.
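    Combined with MSSQLSourceRequest, a pooled-connection flow might look like the sketch below; config is a hypothetical connection config, and the Request is assumed to be set to streaming:

    ```scala
    val results: Stream[IO, js.Object] =
      Stream.eval(MSSQLSourceRequest[js.Object]("select * from accounts", config))
        .flatMap(req => queryToStream[IO, js.Object](req))
    ```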

  11. def readableToStream[F[_], A](readable: Readable, qsize: Int = 1000)(implicit ec: ExecutionContext, F: Effect[F]): Stream[F, A]

    Turn a Readable parser into a Stream of js.Object using the onData callback. While A could be a Buffer or String, it could also be a js.Object; A must reflect the caller's understanding of what objects the Readable will produce. You can use this instead of readable.iterator => fs2.Stream when you want errors bubbled up explicitly through the fs2 layer. If other event names are used for the callbacks, use readableToStreamWithEvents.
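    For example, wrapping a node stream (a sketch; fs here is a Node.js facade assumed to be in scope, and the file name is hypothetical):

    ```scala
    val readable: Readable = fs.createReadStream("data.txt")

    val chunks: Stream[IO, Buffer] =
      readableToStream[IO, Buffer](readable)
    ```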

  12. def toInput[A](startIndex: Int = 1): Pipe[IO, A, InputContext[A]]

    Add a source string based on the record index position.
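    For example, tagging CSV records with their 1-based position (a sketch; the fields of InputContext are not shown here):

    ```scala
    val indexed: Stream[IO, InputContext[js.Object]] =
      CSVFileSource("data.csv")
        .through(toInput[js.Object]())
    ```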