How can I resolve these Scala exceptions in IntelliJ?

I am trying to run this project. I added the dependency to my sbt file, which looks like this:

name := "HelloScala"
version := "0.1"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.1"
resolvers += Resolver.bintrayRepo("salesforce", "maven")
libraryDependencies += "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"

Then I copied the Helloworld folder from their repository, but I ran into a lot of problems:

Information:10/09/18, 12:01 PM - Compilation completed with 88 errors and 0 warnings in 15 s 624 ms
Error:scalac: missing or invalid dependency detected while loading class file 'package.class'.
Could not access type Vector in value org.apache.spark.ml.linalg,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'package.class' was compiled against an incompatible version of org.apache.spark.ml.linalg.
Error:scalac: missing or invalid dependency detected while loading class file 'OPVector.class'.
Could not access type Vector in value org.apache.spark.ml.linalg,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OPVector.class' was compiled against an incompatible version of org.apache.spark.ml.linalg.
Error:scalac: missing or invalid dependency detected while loading class file 'OpEvaluatorBase.class'.
Could not access type Evaluator in value org.apache.spark.ml.evaluation,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpEvaluatorBase.class' was compiled against an incompatible version of org.apache.spark.ml.evaluation.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasLabelCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasLabelCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasPredictionCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasPredictionCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasFullPredictionCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasFullPredictionCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasRawPredictionCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasRawPredictionCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpHasProbabilityCol.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpHasProbabilityCol.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'ClassificationModelSelector.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'ClassificationModelSelector.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'InputParams.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'InputParams.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'OpPipelineStageBase.class'.
Could not access type MLWritable in value org.apache.spark.ml.util,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpPipelineStageBase.class' was compiled against an incompatible version of org.apache.spark.ml.util.
Error:scalac: missing or invalid dependency detected while loading class file 'HasLogisticRegression.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasLogisticRegression.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasRandomForestBase.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasRandomForestBase.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasDecisionTreeBase.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasDecisionTreeBase.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasNaiveBayes.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasNaiveBayes.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'DataReaders.class'.
Could not access type Encoder in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'DataReaders.class' was compiled against an incompatible version of org.apache.spark.sql.
Error:scalac: missing or invalid dependency detected while loading class file 'OpWorkflow.class'.
Could not access type SparkSession in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpWorkflow.class' was compiled against an incompatible version of org.apache.spark.sql.
Error:scalac: missing or invalid dependency detected while loading class file 'SplitterParams.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'SplitterParams.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'ModelSelectorBase.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'ModelSelectorBase.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'HasLinearRegression.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasLinearRegression.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasGradientBoostedTreeBase.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasGradientBoostedTreeBase.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasRandomForestBase.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasRandomForestBase.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'DataCutterParams.class'.
Could not access type Params in value org.apache.spark.ml.param,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'DataCutterParams.class' was compiled against an incompatible version of org.apache.spark.ml.param.
Error:scalac: missing or invalid dependency detected while loading class file 'HasDecisionTreeBase.class'.
Could not access type Estimator in package org.apache.spark.ml,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'HasDecisionTreeBase.class' was compiled against an incompatible version of org.apache.spark.ml.
Error:scalac: missing or invalid dependency detected while loading class file 'FeatureBuilder.class'.
Could not access term package in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'FeatureBuilder.class' was compiled against an incompatible version of org.apache.spark.sql.
Error:scalac: missing or invalid dependency detected while loading class file 'FeatureBuilder.class'.
Could not access type DataFrame in value org.apache.spark.sql.package,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'FeatureBuilder.class' was compiled against an incompatible version of org.apache.spark.sql.package.
Error:scalac: missing or invalid dependency detected while loading class file 'OpWorkflowCore.class'.
Could not access type Dataset in package org.apache.spark.sql,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'OpWorkflowCore.class' was compiled against an incompatible version of org.apache.spark.sql.
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/OpTitanicSimple.scala
Error:(42, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(95, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(conf).getOrCreate()
Error:(143, 8) overloaded method value setLabelCol with alternatives:
  (value: com.salesforce.op.features.FeatureLike[T])OpHasLabelCol.this.type <and>
  (value: String)OpHasLabelCol.this.type
 cannot be applied to (com.salesforce.op.features.Feature[com.salesforce.op.features.types.RealNN])
      .setLabelCol(survived)
Error:(154, 64) could not find implicit value for evidence parameter of type org.apache.spark.sql.Encoder[com.salesforce.hw.Passenger]
    val trainDataReader = DataReaders.Simple.csvCase[Passenger](
Error:(166, 40) could not find implicit value for parameter spark: org.apache.spark.sql.SparkSession
    val fittedWorkflow = workflow.train()
Error:(174, 15) value columns is not a member of Any
    dataframe.columns.foreach(println)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/boston/OpBoston.scala
Error:(41, 8) object Dataset is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(41, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(56, 47) not found: type SparkSession
  def customRead(path: Option[String], spark: SparkSession): RDD[BostonHouse] = {
Error:(69, 90) not found: type Dataset
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(69, 50) not found: type SparkSession
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(77, 90) not found: type Dataset
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(77, 50) not found: type SparkSession
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[BostonHouse], Dataset[BostonHouse]] = {
Error:(94, 6) value setGradientBoostedTreeSeed is not a member of com.salesforce.op.stages.impl.selector.HasRandomForestBase[E,MS]
possible cause: maybe a semicolon is missing before `value setGradientBoostedTreeSeed'?
    .setGradientBoostedTreeSeed(randomSeed)
Error:(100, 43) overloaded method value setLabelCol with alternatives:
  (value: com.salesforce.op.features.FeatureLike[T])OpHasLabelCol.this.type <and>
  (value: String)OpHasLabelCol.this.type
 cannot be applied to (com.salesforce.op.features.Feature[com.salesforce.op.features.types.RealNN])
  val evaluator = Evaluators.Regression().setLabelCol(medv).setPredictionCol(prediction)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/dataprep/ConditionalAggregation.scala
Error:(40, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(69, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(conf).getOrCreate()
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/dataprep/JoinsAndAggregates.scala
Error:(40, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(74, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(conf).getOrCreate()
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/iris/IrisFeatures.scala
Error:(38, 36) not found: type Iris
  val id = FeatureBuilder.Integral[Iris].extract(_.getID.toIntegral).asPredictor
Error:(39, 41) not found: type Iris
  val sepalLength = FeatureBuilder.Real[Iris].extract(_.getSepalLength.toReal).asPredictor
Error:(40, 40) not found: type Iris
  val sepalWidth = FeatureBuilder.Real[Iris].extract(_.getSepalWidth.toReal).asPredictor
Error:(41, 41) not found: type Iris
  val petalLength = FeatureBuilder.Real[Iris].extract(_.getPetalLength.toReal).asPredictor
Error:(42, 40) not found: type Iris
  val petalWidth = FeatureBuilder.Real[Iris].extract(_.getPetalWidth.toReal).asPredictor
Error:(43, 39) not found: type Iris
  val irisClass = FeatureBuilder.Text[Iris].extract(_.getClass$.toText).asResponse
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/iris/IrisKryoRegistrator.scala
Error:(40, 47) type Iris is not a member of package com.salesforce.hw.iris
    doAvroRegistration[com.salesforce.hw.iris.Iris](kryo)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/iris/OpIris.scala
Error:(41, 8) object Dataset is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(41, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.{Dataset, SparkSession}
Error:(56, 37) not found: type Iris
  val irisReader = new CustomReader[Iris](key = _.getID.toString){
Error:(57, 76) not found: type Iris
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[Iris], Dataset[Iris]] = {
Error:(57, 83) not found: type Dataset
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[Iris], Dataset[Iris]] = {
Error:(57, 50) not found: type SparkSession
    def readFn(params: OpParams)(implicit spark: SparkSession): Either[RDD[Iris], Dataset[Iris]] = {
Error:(79, 6) value setInput is not a member of com.salesforce.op.stages.impl.selector.HasDecisionTreeBase[E,MS]
possible cause: maybe a semicolon is missing before `value setInput'?
    .setInput(labels, features).getOutput()
Error:(87, 53) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw, prob, labels)
Error:(87, 59) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw, prob, labels)
Error:(87, 64) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw, prob, labels)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/OpTitanic.scala
Error:(54, 45) not found: type Passenger
  val simpleReader = DataReaders.Simple.csv[Passenger](
Error:(55, 5) not found: value schema
    schema = Passenger.getClassSchema.toString, key = _.getPassengerId.toString
Error:(55, 49) not found: value key
    schema = Passenger.getClassSchema.toString, key = _.getPassengerId.toString
Error:(79, 6) value setModelsToTry is not a member of com.salesforce.op.stages.impl.selector.HasRandomForestBase[E,MS]
possible cause: maybe a semicolon is missing before `value setModelsToTry'?
    .setModelsToTry(LogisticRegression, RandomForest)
Error:(83, 53) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw)
Error:(83, 59) type mismatch;
 found   : Any
 required: com.salesforce.op.features.FeatureLike[_ <: com.salesforce.op.features.types.FeatureType]
  val workflow = new OpWorkflow().setResultFeatures(pred, raw)
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/TitanicFeatures.scala
Error:(41, 40) not found: type Passenger
  val pClass = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getPclass).map(_.toString).toPickList).asPredictor // scalastyle:off
Error:(43, 34) not found: type Passenger
  val name = FeatureBuilder.Text[Passenger].extract(d => Option(d.getName).toText).asPredictor
Error:(45, 37) not found: type Passenger
  val sex = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getSex).toPickList).asPredictor
Error:(47, 33) not found: type Passenger
  val age = FeatureBuilder.Real[Passenger].extract(d => Option(Double.unbox(d.getAge)).toReal).asPredictor
Error:(49, 39) not found: type Passenger
  val sibSp = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getSibSp).map(_.toString).toPickList).asPredictor
Error:(51, 39) not found: type Passenger
  val parch = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getParch).map(_.toString).toPickList).asPredictor
Error:(53, 40) not found: type Passenger
  val ticket = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getTicket).toPickList).asPredictor
Error:(57, 39) not found: type Passenger
  val cabin = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getCabin).toPickList).asPredictor
Error:(59, 42) not found: type Passenger
  val embarked = FeatureBuilder.PickList[Passenger].extract(d => Option(d.getEmbarked).toPickList).asPredictor
Error:(39, 40) not found: type Passenger
  val survived = FeatureBuilder.RealNN[Passenger].extract(_.getSurvived.toDouble.toRealNN).asResponse
Error:(55, 34) not found: type Passenger
  val fare = FeatureBuilder.Real[Passenger].extract(d => Option(Double.unbox(d.getFare)).toReal).asPredictor
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/OpTitanicMini.scala
Error:(40, 8) object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Error:(66, 26) not found: value SparkSession
    implicit val spark = SparkSession.builder.config(new SparkConf()).getOrCreate()
Error:(75, 34) value transmogrify is not a member of Any
    val featureVector = features.transmogrify()
Error:(78, 36) value sanityCheck is not a member of Any
    val checkedFeatures = survived.sanityCheck(featureVector, checkSample = 1.0, removeBadFeatures = true)
Error:(78, 63) not found: value checkSample
    val checkedFeatures = survived.sanityCheck(featureVector, checkSample = 1.0, removeBadFeatures = true)
Error:(78, 82) not found: value removeBadFeatures
    val checkedFeatures = survived.sanityCheck(featureVector, checkSample = 1.0, removeBadFeatures = true)
Error:(81, 73) too many arguments for method setInput: (features: (com.salesforce.op.features.FeatureLike[com.salesforce.op.features.types.RealNN], com.salesforce.op.features.FeatureLike[com.salesforce.op.features.types.OPVector]))com.salesforce.op.stages.impl.classification.BinaryClassificationModelSelector
    val (pred, raw, prob) = BinaryClassificationModelSelector().setInput(survived, checkedFeatures).getOutput()
/Users/monk/Desktop/HelloScala/src/main/scala/com/salesforce/hw/titanic/TitanicKryoRegistrator.scala
Error:(41, 50) type Passenger is not a member of package com.salesforce.hw.titanic
    doAvroRegistration[com.salesforce.hw.titanic.Passenger](kryo)

I searched for these errors and found that it may be a version problem, but if it is, I don't understand which versions I should be using. However, when I run the project from the command line, it works:

cd helloworld
./gradlew compileTestScala installDist
./gradlew -q sparkSubmit -Dmain=com.salesforce.hw.OpTitanicSimple -Dargs="\
`pwd`/src/main/resources/TitanicDataset/TitanicPassengersTrainData.csv"

It does not work from IntelliJ. How can I solve this problem?


Answers 1

Accepted answer

Your build.sbt is missing two dependencies: spark-mllib and spark-sql.

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1",
  "org.apache.spark" %% "spark-mllib" % "2.3.1",
  "org.apache.spark" %% "spark-sql" % "2.3.1",
  "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"
)

This will clear the first block of errors.
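As a quick sanity check (a minimal sketch, not part of the project), the SparkSession import that the compiler rejected in OpTitanicSimple.scala should compile once spark-sql resolves:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Compiles and runs only when spark-core and spark-sql are on the classpath.
object SparkSmokeTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("smoke-test").setMaster("local[*]")
    val spark = SparkSession.builder.config(conf).getOrCreate()
    println(s"Spark version: ${spark.version}")
    spark.stop()
  }
}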

You should use the "provided" scope:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.1" % "provided",
  "org.apache.spark" %% "spark-mllib" % "2.3.1" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.3.1" % "provided",
  "com.salesforce.transmogrifai" %% "transmogrifai-core" % "0.3.4"
)

merenptah 10.09.2018 09:21
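A note on why "provided" matters here: it keeps the Spark jars out of your application artifact, because spark-submit supplies them on the cluster at runtime. Be aware, though, that with "provided" the examples will no longer launch directly from IntelliJ unless the run configuration is told to include provided-scope dependencies (newer IntelliJ versions expose an "Include dependencies with 'Provided' scope" checkbox for this).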

@Luca, two questions: how did you know that I needed to add those dependencies? The first block of errors is gone; now I'm running into notepad.pw/y37gplk1

Aaditya Ura 10.09.2018 09:34

@AyodhyankitPaul Because "Could not access type *** in value org.apache.spark.ml.linalg" refers to a type that lives in the "spark-mllib" artifact, while "Could not access type *** in package org.apache.spark.sql" refers to one in "spark-sql"

Luca T. 10.09.2018 09:57

@AyodhyankitPaul Regarding the remaining errors: the problem is that Passenger is a data structure defined in Avro. You need to add a plugin to your sbt build that generates the Passenger class from the Avro definition.

Luca T. 10.09.2018 10:02
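For reference, a minimal sketch of such a setup using the sbt-avro plugin (the coordinates and versions below are assumptions; check https://github.com/sbt/sbt-avro for the release matching your sbt version):

// project/plugins.sbt
addSbtPlugin("com.github.sbt" % "sbt-avro" % "3.4.3")

// build.sbt
enablePlugins(SbtAvro) // may already be auto-enabled, depending on the plugin version
// pin the Avro runtime used by the generated classes
libraryDependencies += "org.apache.avro" % "avro" % "1.8.2"

With the schema files from the helloworld project (e.g. Passenger.avsc) placed under src/main/avro, sbt compile should then generate the Passenger Java class that TitanicFeatures.scala and the data readers reference.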
