org.apache.spark.sql.hive.execution

InsertIntoHiveTable

case class InsertIntoHiveTable(table: CatalogTable, partition: Map[String, Option[String]], query: LogicalPlan, overwrite: Boolean, ifPartitionNotExists: Boolean, outputColumnNames: Seq[String]) extends LogicalPlan with SaveAsHiveFile with Product with Serializable

Command for writing data out to a Hive table.

This class is mostly a mess, for legacy reasons (since it evolved in organic ways and had to follow Hive's internal implementations closely, which itself was a mess too). Please don't blame Reynold for this! He was just moving code around!

In the future we should converge the write path for Hive with the normal data source write path, as defined in org.apache.spark.sql.execution.datasources.FileFormatWriter.

table

the metadata of the table.

partition

a map from the partition key to the partition value (optional). If a partition value is not given (None), a dynamic partition insert will be performed. As an example, INSERT INTO tbl PARTITION (a=1, b=2) AS ... would have

Map("a" -> Some("1"), "b" -> Some("2"))

and INSERT INTO tbl PARTITION (a=1, b) AS ... would have

Map("a" -> Some("1"), "b" -> None).

(See the example sketch after the parameter descriptions.)

query

the logical plan representing the data to write.

overwrite

whether to overwrite the existing table or partitions.

ifPartitionNotExists

If true, only write if the partition does not exist. Only valid for static partitions.
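
For illustration, a minimal sketch of issuing the two partition forms above through the SQL API. The session setup and the table and column names (tbl, src, a, b, c) are hypothetical and not part of this API:

import org.apache.spark.sql.SparkSession

// Hypothetical Hive-enabled session; assumes a Hive metastore is configured.
val spark = SparkSession.builder()
  .appName("hive-partition-insert-sketch")
  .enableHiveSupport()
  .getOrCreate()

// Static partition insert: both values are given, so the command's partition
// argument is Map("a" -> Some("1"), "b" -> Some("2")).
spark.sql("INSERT INTO tbl PARTITION (a = 1, b = 2) SELECT c FROM src")

// Dynamic partition insert: b has no value, so the partition argument is
// Map("a" -> Some("1"), "b" -> None) and b is resolved per row from the query.
spark.sql("INSERT INTO tbl PARTITION (a = 1, b) SELECT c, b FROM src")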

Linear Supertypes
Serializable, Serializable, SaveAsHiveFile, DataWritingCommand, Command, LogicalPlan, Logging, QueryPlanConstraints, ConstraintHelper, LogicalPlanStats, AnalysisHelper, QueryPlan[LogicalPlan], TreeNode[LogicalPlan], Product, Equals, AnyRef, Any

Instance Constructors

  1. new InsertIntoHiveTable(table: CatalogTable, partition: Map[String, Option[String]], query: LogicalPlan, overwrite: Boolean, ifPartitionNotExists: Boolean, outputColumnNames: Seq[String])

    table

    the metadata of the table.

    partition

    a map from the partition key to the partition value (optional). If a partition value is not given (None), a dynamic partition insert will be performed. As an example, INSERT INTO tbl PARTITION (a=1, b=2) AS ... would have

    Map("a" -> Some("1"), "b" -> Some("2"))

    and INSERT INTO tbl PARTITION (a=1, b) AS ... would have

    Map("a" -> Some("1"), "b" -> None).

    query

    the logical plan representing the data to write.

    overwrite

    whether to overwrite the existing table or partitions.

    ifPartitionNotExists

    If true, only write if the partition does not exist. Only valid for static partitions.
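
    A minimal sketch of constructing this node programmatically (normally the analyzer builds it when planning an INSERT). The session setup and the table names db.tbl and src are hypothetical, and sessionState is an unstable internal API:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.catalyst.TableIdentifier
    import org.apache.spark.sql.hive.execution.InsertIntoHiveTable

    // Hypothetical names; assumes the tables already exist in the Hive metastore.
    val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
    val tableMeta = spark.sessionState.catalog.getTableMetadata(TableIdentifier("tbl", Some("db")))
    val child = spark.table("src").queryExecution.analyzed

    // Roughly equivalent to: INSERT INTO db.tbl PARTITION (a = 1, b) SELECT c, b FROM src
    val insert = InsertIntoHiveTable(
      table = tableMeta,
      partition = Map("a" -> Some("1"), "b" -> None), // a is static, b is dynamic
      query = child,
      overwrite = false,
      ifPartitionNotExists = false,
      outputColumnNames = Seq("c", "b"))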

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. lazy val allAttributes: AttributeSeq
    Definition Classes
    QueryPlan
  5. def analyzed: Boolean
    Definition Classes
    AnalysisHelper
  6. def apply(number: Int): TreeNode[_]
    Definition Classes
    TreeNode
  7. def argString: String
    Definition Classes
    TreeNode
  8. def asCode: String
    Definition Classes
    TreeNode
  9. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  10. def assertNotAnalysisRule(): Unit
    Attributes
    protected
    Definition Classes
    AnalysisHelper
  11. def basicWriteJobStatsTracker(hadoopConf: Configuration): BasicWriteJobStatsTracker
    Definition Classes
    DataWritingCommand
  12. final lazy val canonicalized: LogicalPlan
    Definition Classes
    QueryPlan
    Annotations
    @transient()
  13. final def children: Seq[LogicalPlan]
    Definition Classes
    DataWritingCommand → Command → TreeNode
  14. def childrenResolved: Boolean
    Definition Classes
    LogicalPlan
  15. def clone(): AnyRef
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  16. def collect[B](pf: PartialFunction[LogicalPlan, B]): Seq[B]
    Definition Classes
    TreeNode
  17. def collectFirst[B](pf: PartialFunction[LogicalPlan, B]): Option[B]
    Definition Classes
    TreeNode
  18. def collectLeaves(): Seq[LogicalPlan]
    Definition Classes
    TreeNode
  19. def conf: SQLConf
    Definition Classes
    QueryPlan
  20. lazy val constraints: ExpressionSet
    Definition Classes
    QueryPlanConstraints
  21. def constructIsNotNullConstraints(constraints: Set[Expression], output: Seq[Attribute]): Set[Expression]
    Definition Classes
    ConstraintHelper
  22. lazy val containsChild: Set[TreeNode[_]]
    Definition Classes
    TreeNode
  23. val createdTempDir: Option[Path]
    Definition Classes
    SaveAsHiveFile
  24. def deleteExternalTmpPath(hadoopConf: Configuration): Unit
    Attributes
    protected
    Definition Classes
    SaveAsHiveFile
  25. def doCanonicalize(): LogicalPlan
    Attributes
    protected
    Definition Classes
    QueryPlan
  26. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  27. final def expressions: Seq[Expression]
    Definition Classes
    QueryPlan
  28. def fastEquals(other: TreeNode[_]): Boolean
    Definition Classes
    TreeNode
  29. def finalize(): Unit
    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  30. def find(f: (LogicalPlan) ⇒ Boolean): Option[LogicalPlan]
    Definition Classes
    TreeNode
  31. def flatMap[A](f: (LogicalPlan) ⇒ TraversableOnce[A]): Seq[A]
    Definition Classes
    TreeNode
  32. def foreach(f: (LogicalPlan) ⇒ Unit): Unit
    Definition Classes
    TreeNode
  33. def foreachUp(f: (LogicalPlan) ⇒ Unit): Unit
    Definition Classes
    TreeNode
  34. def generateTreeString(depth: Int, lastChildren: Seq[Boolean], writer: Writer, verbose: Boolean, prefix: String, addSuffix: Boolean): Unit
    Definition Classes
    TreeNode
  35. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  36. def getExternalTmpPath(sparkSession: SparkSession, hadoopConf: Configuration, path: Path): Path
    Attributes
    protected
    Definition Classes
    SaveAsHiveFile
  37. def hashCode(): Int
    Definition Classes
    TreeNode → AnyRef → Any
  38. val ifPartitionNotExists: Boolean
  39. def inferAdditionalConstraints(constraints: Set[Expression]): Set[Expression]
    Definition Classes
    ConstraintHelper
  40. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean = false): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  41. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  42. def innerChildren: Seq[QueryPlan[_]]
    Attributes
    protected
    Definition Classes
    QueryPlan → TreeNode
  43. def inputSet: AttributeSet
    Definition Classes
    QueryPlan
  44. final def invalidateStatsCache(): Unit
    Definition Classes
    LogicalPlanStats
  45. def isCanonicalizedPlan: Boolean
    Attributes
    protected
    Definition Classes
    QueryPlan
  46. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  47. def isStreaming: Boolean
    Definition Classes
    LogicalPlan
  48. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  49. def jsonFields: List[JField]
    Attributes
    protected
    Definition Classes
    TreeNode
  50. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  51. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  52. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  53. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  54. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  55. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  56. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  57. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  58. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  59. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  60. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  61. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  62. def makeCopy(newArgs: Array[AnyRef]): LogicalPlan
    Definition Classes
    TreeNode
  63. def map[A](f: (LogicalPlan) ⇒ A): Seq[A]
    Definition Classes
    TreeNode
  64. def mapChildren(f: (LogicalPlan) ⇒ LogicalPlan): LogicalPlan
    Definition Classes
    TreeNode
  65. def mapExpressions(f: (Expression) ⇒ Expression): InsertIntoHiveTable.this.type
    Definition Classes
    QueryPlan
  66. def mapProductIterator[B](f: (Any) ⇒ B)(implicit arg0: ClassTag[B]): Array[B]
    Attributes
    protected
    Definition Classes
    TreeNode
  67. def maxRows: Option[Long]
    Definition Classes
    LogicalPlan
  68. def maxRowsPerPartition: Option[Long]
    Definition Classes
    LogicalPlan
  69. lazy val metrics: Map[String, SQLMetric]
    Definition Classes
    DataWritingCommand
  70. def missingInput: AttributeSet
    Definition Classes
    QueryPlan
  71. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  72. def nodeName: String
    Definition Classes
    TreeNode
  73. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  74. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  75. def numberedTreeString: String
    Definition Classes
    TreeNode
  76. val origin: Origin
    Definition Classes
    TreeNode
  77. def otherCopyArgs: Seq[AnyRef]
    Attributes
    protected
    Definition Classes
    TreeNode
  78. def output: Seq[Attribute]
    Definition Classes
    Command → QueryPlan
  79. val outputColumnNames: Seq[String]
    Definition Classes
    InsertIntoHiveTable → DataWritingCommand
  80. def outputColumns: Seq[Attribute]
    Definition Classes
    DataWritingCommand
  81. def outputOrdering: Seq[SortOrder]
    Definition Classes
    LogicalPlan
  82. def outputSet: AttributeSet
    Definition Classes
    QueryPlan
  83. val overwrite: Boolean
  84. def p(number: Int): LogicalPlan
    Definition Classes
    TreeNode
  85. val partition: Map[String, Option[String]]
  86. def prettyJson: String
    Definition Classes
    TreeNode
  87. def printSchema(): Unit
    Definition Classes
    QueryPlan
  88. def producedAttributes: AttributeSet
    Definition Classes
    QueryPlan
  89. val query: LogicalPlan
    Definition Classes
    InsertIntoHiveTable → DataWritingCommand
  90. def references: AttributeSet
    Definition Classes
    QueryPlan
  91. def refresh(): Unit
    Definition Classes
    LogicalPlan
  92. def resolve(nameParts: Seq[String], resolver: Resolver): Option[NamedExpression]
    Definition Classes
    LogicalPlan
  93. def resolve(schema: StructType, resolver: Resolver): Seq[Attribute]
    Definition Classes
    LogicalPlan
  94. def resolveChildren(nameParts: Seq[String], resolver: Resolver): Option[NamedExpression]
    Definition Classes
    LogicalPlan
  95. def resolveExpressions(r: PartialFunction[Expression, Expression]): LogicalPlan
    Definition Classes
    AnalysisHelper
  96. def resolveOperators(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  97. def resolveOperatorsDown(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  98. def resolveOperatorsUp(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper
  99. def resolveQuoted(name: String, resolver: Resolver): Option[NamedExpression]
    Definition Classes
    LogicalPlan
  100. lazy val resolved: Boolean
    Definition Classes
    LogicalPlan
  101. def run(sparkSession: SparkSession, child: SparkPlan): Seq[Row]

    Inserts all the rows in the table into Hive. Row objects are properly serialized with the org.apache.hadoop.hive.serde2.SerDe and the org.apache.hadoop.mapred.OutputFormat provided by the table definition. (A usage sketch follows the member list.)

    Definition Classes
    InsertIntoHiveTable → DataWritingCommand
  102. def sameOutput(other: LogicalPlan): Boolean
    Definition Classes
    LogicalPlan
  103. final def sameResult(other: LogicalPlan): Boolean
    Definition Classes
    QueryPlan
  104. def saveAsHiveFile(sparkSession: SparkSession, plan: SparkPlan, hadoopConf: Configuration, fileSinkConf: ShimFileSinkDesc, outputLocation: String, customPartitionLocations: Map[TablePartitionSpec, String] = Map.empty, partitionAttributes: Seq[Attribute] = Nil): Set[String]
    Attributes
    protected
    Definition Classes
    SaveAsHiveFile
  105. lazy val schema: StructType
    Definition Classes
    QueryPlan
  106. def schemaString: String
    Definition Classes
    QueryPlan
  107. final def semanticHash(): Int
    Definition Classes
    QueryPlan
  108. def simpleString: String
    Definition Classes
    QueryPlan → TreeNode
  109. def statePrefix: String
    Attributes
    protected
    Definition Classes
    LogicalPlan → QueryPlan
  110. def stats: Statistics
    Definition Classes
    LogicalPlanStats
  111. val statsCache: Option[Statistics]
    Attributes
    protected
    Definition Classes
    LogicalPlanStats
  112. def stringArgs: Iterator[Any]
    Attributes
    protected
    Definition Classes
    TreeNode
  113. def subqueries: Seq[LogicalPlan]
    Definition Classes
    QueryPlan
  114. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  115. val table: CatalogTable
  116. def toJSON: String
    Definition Classes
    TreeNode
  117. def toString(): String
    Definition Classes
    TreeNode → AnyRef → Any
  118. def transform(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode
  119. def transformAllExpressions(rule: PartialFunction[Expression, Expression]): InsertIntoHiveTable.this.type
    Definition Classes
    AnalysisHelper → QueryPlan
  120. def transformDown(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper → TreeNode
  121. def transformExpressions(rule: PartialFunction[Expression, Expression]): InsertIntoHiveTable.this.type
    Definition Classes
    QueryPlan
  122. def transformExpressionsDown(rule: PartialFunction[Expression, Expression]): InsertIntoHiveTable.this.type
    Definition Classes
    QueryPlan
  123. def transformExpressionsUp(rule: PartialFunction[Expression, Expression]): InsertIntoHiveTable.this.type
    Definition Classes
    QueryPlan
  124. def transformUp(rule: PartialFunction[LogicalPlan, LogicalPlan]): LogicalPlan
    Definition Classes
    AnalysisHelper → TreeNode
  125. def treeString(writer: Writer, verbose: Boolean, addSuffix: Boolean): Unit
    Definition Classes
    TreeNode
  126. def treeString(verbose: Boolean, addSuffix: Boolean): String
    Definition Classes
    TreeNode
  127. def treeString: String
    Definition Classes
    TreeNode
  128. def validConstraints: Set[Expression]
    Attributes
    protected
    Definition Classes
    QueryPlanConstraints
  129. def verboseString: String
    Definition Classes
    QueryPlan → TreeNode
  130. def verboseStringWithSuffix: String
    Definition Classes
    LogicalPlan → TreeNode
  131. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  132. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  133. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @native() @throws( ... )
  134. def withNewChildren(newChildren: Seq[LogicalPlan]): LogicalPlan
    Definition Classes
    TreeNode
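
A rough usage sketch, referenced from run above: for a Hive-format table (one not converted to a native data source table), an INSERT issued through SQL or the DataFrame writer is planned as an InsertIntoHiveTable command, and executing the query invokes run. The database and table names below are hypothetical:

import org.apache.spark.sql.{SaveMode, SparkSession}

// Hypothetical names; assumes hive_db.target_table and hive_db.source_table exist.
val spark = SparkSession.builder()
  .appName("insert-into-hive-run-sketch")
  .enableHiveSupport()
  .getOrCreate()
import spark.implicits._

val df = Seq((1, "x"), (2, "y")).toDF("id", "value")

// Each write below is planned as an InsertIntoHiveTable over the query's logical
// plan; executing it calls run(sparkSession, child), which writes the rows with
// the table's SerDe/OutputFormat and loads the result into the metastore table.
df.write.mode(SaveMode.Overwrite).insertInto("hive_db.target_table") // overwrite = true
spark.sql("INSERT INTO hive_db.target_table SELECT id, value FROM hive_db.source_table")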
