From 8b925487d3c6e5ebf1e3eb0f0cc04e7ea3d83126 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Sun, 15 Dec 2024 15:51:35 -0800 Subject: [PATCH 01/47] Add texera.io to README (#3158) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit We are excited to announce texera.io as the official website for the Texera project. - Redundant sections such as “Motivation,” “Education,” and “Videos” have been removed. - The "Goals" and “Publications” sections (including both computer science and interdisciplinary publications) are retained for now, despite some redundancy; we will consider removing them later. - The blog has been relocated to a new dedicated home under texera.io for better accessibility and organization. The old blog site is now deprecated. --- README.md | 64 ++----------------------------------------------------- 1 file changed, 2 insertions(+), 62 deletions(-) diff --git a/README.md b/README.md index 21a62c8c2d8..7c2f1dc2c51 100644 --- a/README.md +++ b/README.md @@ -9,9 +9,9 @@

- Demo Video + Official Site | - Blogs + Blogs | Getting Started
@@ -29,13 +29,6 @@ Static Badge

-# Motivation - -* Data science is labor-intensive and particularly challenging for non-IT users applying AI/ML. -* Many workflow-based data science platforms lack parallelism, limiting their ability to handle big datasets. -* Cloud services and technologies have advanced significantly over the past decade, enabling powerful browser-based interfaces supported by high-speed networks. -* Existing data science platforms offer limited interaction during long-running jobs, making them difficult to manage after execution begins. - # Goals * Provide data science as cloud services; @@ -148,59 +141,6 @@ The workflow in the use case shown below includes data cleaning, ML model traini _In JAMIA 2021_ | [PDF](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7989302/pdf/ocab047.pdf) - -# Education - - - - - - -
- - - -

Data Science for All

- An NSF-funded summer program to teach high-school students data science and AI/ML -
- - - -

ICS 80: Data Science and AI/ML Using Workflows

- A Spring 2024 course at UCI, teaching 42 undergraduates, most of whom are not computer science majors, to learn data science and AI/ML -
- - - -

Workshop of Data Science for Everyone at Cerritos College

- A two-day workshop designed for non-CS students to learn data science and ML without a single line of coding -
- - -# Videos - - - - - - -
- - Watch the video - -

dkNET Webinar 04/26/2024

-
- - Watch the video - -

Texera Demo @ VLDB'20

-
- - Watch the video - -

Amber Presentation @ VLDB'20

-
- # Getting Started * For users, visit [Guide to Use Texera](https://github.com/Texera/texera/wiki/Getting-Started). From edaf799ca05be56d5550cb5340aa31076e1c2664 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Mon, 16 Dec 2024 12:49:19 -0800 Subject: [PATCH 02/47] Remove BulkDownloader operator (#3161) This operator's design requires some more discussion. The current issues are: 1. It accesses the workflow context (i.e., workflowId and executionId), which may not be permitted in the future. 2. It interacts directly with the local file system. For the short term, we will drop support for such a downloader. We can discuss how to support it in the future if needed. --- .../assets/operator_images/BulkDownloader.png | Bin 7262 -> 0 bytes .../uci/ics/amber/operator/LogicalOp.scala | 2 - .../download/BulkDownloaderOpDesc.scala | 87 ------------------ .../download/BulkDownloaderOpExec.scala | 80 ---------------- .../download/BulkDownloaderOpExecSpec.scala | 44 --------- 5 files changed, 213 deletions(-) delete mode 100644 core/gui/src/assets/operator_images/BulkDownloader.png delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpDesc.scala delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpExec.scala delete mode 100644 core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpExecSpec.scala diff --git a/core/gui/src/assets/operator_images/BulkDownloader.png b/core/gui/src/assets/operator_images/BulkDownloader.png deleted file mode 100644 index 0b363c6ac813aa2f312fc496a87b1d2364c3c817..0000000000000000000000000000000000000000 GIT binary patch literal 0 HcmV?d00001 literal 7262 zcmd6MdmvQX`~DiEsFdy!xwVm$OUfme(J5)WpKeNFx}Xp>1~HgjKIQl(YLEN4%sJ(9 zj$BHVA*NKC3T2co24$ilL*v#kX1-gUI-gGG{Qmm=@%>|FmbKpftoK>Z`@U=KCEm@&
ze(vn$vjG5e;cZ*C10aJnWq|xl>}TX;>jd^Q>-e_4App#)pZ?$)f(s90oyugpJ>(t1 zzU1%&Bp(nS9&T_nC?Mp}f#W^~!6ZMrz-l=FD*f=5O*>C!4REQ+$9$qYKPuwin9cra zvEHdYKWkK-{_Wt(t=b16n3H+^`OeMzcmMqKvc68yQU5EB^tTjOsl)o|mJPs&K)$ zs$WmUptav%xi?}Lz6~wVv9M$(Lb|#UThMxU$z(PhF_kk*de5S5lh!yPHS2NBlG z0P(sBZGW1?OfuSE1m~O7M(&rIEfmmp`Pnr$ISr{ z+wihDzb-E<=V#faFQ+0i1EqHKg7ATA@-NkUP{Tr1Nt|x&y7Ae4u&%0Pn778l^TV}Ke~QXVgCD+ zzl|`0gJflw%>P0q^!O6+ME}3IeIq&Duh5U;<9@L3i{umBJKa$H&C-yQ!fQ)nT7o$cSy|36Cmo<0e_X)qmPKXCh= z{+}`-8~;g^Z+?A8f1+or?RSy)O`Xu=f1^Lr_zzL@Rg`ZkeQSJ`$#%v58DPZWA5!C6 zvS21N<)=;lHO@Y(QyBXQocgNUM32q)Df91+{~2fU@>raG3GS}dfZ_hP?E2>T7nAAu z|F0t9RRH&MQJe3R=tnZ6U{2gg5`Hso(n{N4I|Ms58yAEW;@ zN57K&j@y@f`+H1+xf>ZHV10KPq&r=|HDQ5+Z~!vXdAaYn0irVt_`drO)a)!wfc$hU zVrYbLSJY5jbP~;mZL)ZR2nG2wsu{clNH>3yX&(4)JlI+3UK>=kkzJz2!8sk!V-p;O zSE0!k?!vH^PzriO+ajW=T#*}K!X4gw(AsBf^QOj}3%u}_V;{Psju*Tg#3vduv8M+so#Ai3mFj zv_<_;L#o<;Anw!mzinL2wqKXPk*Gxcj5Z{zo!%%9?w{Di$K5~f&kQKbo<-b!Hj-EO zTmpM}M#}eTfMxHxAi}=LUyLmlv92ggJ)jmVU8X!b_5uaOWOe89@_Qava%O#!w8WMf z`I`&9`+tRVc0|gbbp=KbIgW}(XRu|{{-~8btB`*t^xVjDBr{RmCTu#D0Aa`pEqEt> zD>;1IzGM^JhGTzhxc6{(qKl`!B;eUH!Q%{U2Fz`uidxz|i$!jne2*Ph0(%O{1cyHc z9#EgL`5hVaO|`+Yys^fRr%kf0jQl+>w>8>Jism_y4)QgZnc`Tx#~j0@(pPI+GLYr1 zQ8O81E2R~Q2G*cn2fTQw6&+)~h_&k6ab8{Ojaj~q=SJq{NCuoq;e5@=k(;fQT;!E& zks?=Xg{YKyV+1JBF@J*7)EIsRmzywmb9bvVFVQ|ceyfU&U;1<0Finj()8A@QmjWyB zikgT}JECDSbiM^|6exrIY~vn@DE~g{1g7@3u9QMmYuUwlal{w0;8Il02r${%J?-UC z_y0Vz6t8`Ik&5oWxKR?$(7>@)rTJa;lE2xZ2OGqj;G&mLBrtvY`9ZHdNnH(OAwB-M@4}*)NGMq@zov4+wC__DC=t01YIxEmiHHOW5V=V?!$) z6@Lo$uo%dN%%BKwzGh)8AnsSv7M;T;XZLBJ#&jr6;YfIz z-fG`sXRF_NtC{ORZ=|By-9>%LPW6z=OcaNOkrv=jQQ)prjzGlx6c|RG4;6G@q zl=X{Ohz1>nEf95Y3(_2`9vyk&I#kiTavX7gn>6}A-hZ^-k zpakHxWPC5>gOhg>a$+l{|8Dh=jnXSm0=VM@I)Q~ukseaJvwAZ_v9ghffy!tx2r{f5 zMYkmjo+LfTPq#D|lttHBAPpM?eU{UZOBUf?bNs;zF>Wa$pYC-IM;w;hqfwV7Q1QfG zPKG_A?Q8=+_t|O@NgHOcx^IZ~9Fu@KPO?tISn{BB4P{6 zJSG5?CspUq0(63H>=xFMbE*KNb~YXW^?}p z+~g#D0qN>ktYQBO*V1@R5H(FeZ(jn3jSmf8eIYsWPAVCPCR0=oySTVqGhrubS6v-i 
z6TMgMV`q{R2}X4r52R}DaUpFINZZdAtEL9IT5Gcpp{hB80N-$QP^Zd)E?KBb9 ztJaK*n|dK(ur3t!=RmzK>()415RTcgHt@9=m7n37*oR96gOsT^H)Cs=U6|Pzxs7mV zt49JdN6x6>X20@PQT-n~)^&0u^kVPj8{X4)RBnG>BBk>fKjS%9-1Ca%3y*Tm5#^qW zF`bAL#6xnrRqNgnT+?Xn*}QoATqc$2d}HK7KwEIaM_EqE^CsCn^&W|0<#(fc(g!-hhDu(0mhK_EAJ40LK-ctV<8e z&K2wo=ryA@mYJPV9c^ilO%X^B~dgNErebqG84P!f8inbsdlBP}U{nG8F zMg27u^gxj6_sHYj4*Ld8Zpwr))s>j}I61i}KOk`3P zv&=hk1S(^-EwOI5-a$%`=n~dCUvQXv?!(N!qmFP);mG1c8}mlM&UY&o-~y8xbB|_u zL?Xu7%w(Q=YR*9?-4r#X?-!7d`Rx>UckB>C<(M*OL=zD z5uT@SeJ?!`dzF5+5SM^?6t=<(Lt1e)SzrzP-5zBr(Bz4W>4fM#t1+0dQXjYX^_FPO z8ZAVwT*Ohu*#>kdFJR$cijSgzVW)-cGNoo}gCi;OtR7d6EWmgUToZAuaJKzhY$<_~ z`8WV8;dsp+U69{nWlDX407PyP?p%G3 zB7Ht6GhX0^)v~}%fYr0b_Ew<8=rllz~)b>ea7#UFSb^7u9q!U zhE({!F?Lm>;z)D_9JVxPM&?SuNU(-G(s@_mjts%l8^t#b*{?a)W1Y?<*Ow<4&NATP zgfElwnxMn&$1fH6pNfWU(gnXrec+(gmh-MV1D4VLG{Gp&vgHiR)2D&|OaM{SdCBB7 zJLaZ^3U^7`{rK`o86uJh3p2@Pi&sxom<>#jm=q0F?r;cx^Vq3m=_{4+CF;a>EaINV zz)dc~9J1Lxy{S-he{QOPJCReh&hmPGCKw7ri~=9Q2Htz+@+dp9a$!;sYQHoqpjZXn zpT%>f?O$lv7rR;>6gK3L2Te~teOT)1D7*q;Un?`b=eMWxf?K#{y)BS?BfoP-Bm@{< z?j={GHNyTy9J||zDWioDA)SARXu(}F+uLzm=3de6=P7V)?fHyTr?9v+FY(^1R6IEq z%egf_ICMoS??NcGr-U5-rrO7cnpyODw2=WAo)|=EoPT|KmN3 zJcxpw;qvj0iU6!s-H8>F0Z#{#D_`$E`6M3)i2sR8RwFu5Qqd23Sk=MKfdDLfGl?D0 z^Q|4>ur#c4CB?U`0Y>{LbuE4$nSW?PupQm!=xUrF%drDM;VJv~-VNY}fp%5Z$l^sO zeWxK8`C*4$8EI)HFvpI^pe()@n?mNs8f{S|Ur%R<24g2}S#V{ztaS1+8X!kwHG734&zvFEzQ|;_38*N*P1*XglG0zshE(XQ1?_d z8L(Tt9qo2dH2&2PV#mN?c~R$`t_8V9yPAS?y`P})%i>ydHghdI#!aJKuEse8lQCCPW! z2Y91tV`IsHb+XA-WPSYAs7ZkO1Hv+vEvIA>tvSq*#sh|TZ^3Jd!yuJYVt<8d>%E-i z#+kk#o?ki(5W~h?g>=2T-eZ2A6;JV#Q#gf6ZYazwR|ZcCx@r4Ug_G@JZG*=~x^hgA z+Y@`Rr1o!LBaaMV%VI7BSJ=97j)Z>tfG1nXRHS5^91+Og~2EC(8!;7O?lf@L7nAPk! 
zotRgB*L2zLXp}xXTfc55!&)w{Iv%@&R?EI8efqHqHt?FdHv!BwxUca9QqCeAFc~ez z?hdNLN6nsln}&O%gq+?jfc5y6Nr0)8ciRu3yvq>v2Ahfpeb=&rf~9@0I@{mhxA1lm zrVXcKrK?KOhuf1I7}HxLHGjPqAQlnBTkEQx9hFFFyo8sJWx(Ux1vK%c^OE}AZFf79 zdA+{t<^dx`Ls!zkR_TG(Ia4;-yvH34IN&>A0_UItJ4A6^A9vWRZPFaDti+L~FU7Q{ zxZ@{K#qo{Ug8_rC*FuU2MokqGUG1uk`)W=BV6-g!GQ~SqrE+3(ca3pVF`T%Kv|DjA z4wR2(Levd?isvae>NcbMGX#N{v6H)Q#81z;W8o_=dEv|YO2}yo11Ap307hXnZ17+i zow}|g3GDVv;}r&yFbqavCoB}4kPclA{uJWIq~Nn`-gM%DZ|@E`C&xHRLgROQ9E{`G zu@3AIdtyOjRxnv5?Q|8U-H)Ej@^II+F!aK5HE*qM1gz68rm?X(pQp8WF8|c3C$#}( z(ZLSF=SL-Puj4L2X*w3#?8IYH3V^tgit?09-YJWYPTP-bSFK`)REkbuo9%#;kOq%A zP2wt506cyPc|^0i8kaN!1P+vA^_kQgX0uHl!)ky;TTmKh;v`_jMr(76MZvEGa+y7= zk@M|YymqKZe!VrZES_oH0UaqWn^s*<&!klP4*CG+ab#r8d+hhw5p=|#Z|1;Dyr zvP~!^8(zu%eeaY!@s=mtK^G+7=XBf4fU@T#RGP`_McX$fQOoXb6(>N!@Ob4N@Pyf{ z8?g&bY93b4Zc|1oN?e7N9t(VQIKhL2-+z?xKj zv_{jU_ImKb&2x}l1-hbVkm0tBHa)-!TdP~OhOG`&6%MUYT-OTrjNj%%(4CZR`j68Tx+m<#RWmAOtY)79kNedAqUjDDj{krZ;GEdF{p3ZbfiL-b!=xl-BI12BP&93fSB~?GC@5>36jSR*9Hwa)mmn|hf5l{awY84ez diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala index 1c6339efcfc..ac88f098cd3 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala @@ -11,7 +11,6 @@ import edu.uci.ics.amber.operator.cartesianProduct.CartesianProductOpDesc import edu.uci.ics.amber.operator.dictionary.DictionaryMatcherOpDesc import edu.uci.ics.amber.operator.difference.DifferenceOpDesc import edu.uci.ics.amber.operator.distinct.DistinctOpDesc -import edu.uci.ics.amber.operator.download.BulkDownloaderOpDesc import edu.uci.ics.amber.operator.dummy.DummyOpDesc import edu.uci.ics.amber.operator.filter.SpecializedFilterOpDesc import edu.uci.ics.amber.operator.hashJoin.HashJoinOpDesc @@ -202,7 +201,6 @@ trait StateTransferFunc new Type(value = 
classOf[RedditSearchSourceOpDesc], name = "RedditSearch"), new Type(value = classOf[PythonLambdaFunctionOpDesc], name = "PythonLambdaFunction"), new Type(value = classOf[PythonTableReducerOpDesc], name = "PythonTableReducer"), - new Type(value = classOf[BulkDownloaderOpDesc], name = "BulkDownloader"), new Type(value = classOf[URLFetcherOpDesc], name = "URLFetcher"), new Type(value = classOf[CartesianProductOpDesc], name = "CartesianProduct"), new Type(value = classOf[FilledAreaPlotOpDesc], name = "FilledAreaPlot"), diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpDesc.scala deleted file mode 100644 index 212f815feaf..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpDesc.scala +++ /dev/null @@ -1,87 +0,0 @@ -package edu.uci.ics.amber.operator.download - -import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} -import com.google.common.base.Preconditions -import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.executor.OpExecInitInfo -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} -import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} -import edu.uci.ics.amber.operator.LogicalOp -import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} - -class BulkDownloaderOpDesc extends LogicalOp { - - @JsonProperty(required = true) - @JsonSchemaTitle("URL Attribute") - @JsonPropertyDescription( - "Only accepts standard URL format" - ) - @AutofillAttributeName - var urlAttribute: String = _ - - @JsonProperty(required = 
true) - @JsonSchemaTitle("Result Attribute") - @JsonPropertyDescription( - "Attribute name for results(downloaded file paths)" - ) - var resultAttribute: String = _ - - override def getPhysicalOp( - workflowId: WorkflowIdentity, - executionId: ExecutionIdentity - ): PhysicalOp = { - PhysicalOp - .oneToOnePhysicalOp( - workflowId, - executionId, - operatorIdentifier, - OpExecInitInfo((_, _) => - new BulkDownloaderOpExec( - getContext, - urlAttribute - ) - ) - ) - .withInputPorts(operatorInfo.inputPorts) - .withOutputPorts(operatorInfo.outputPorts) - .withPropagateSchema( - SchemaPropagationFunc(inputSchemas => - Map( - operatorInfo.outputPorts.head.id -> getOutputSchema( - operatorInfo.inputPorts.map(_.id).map(inputSchemas(_)).toArray - ) - ) - ) - ) - } - - override def operatorInfo: OperatorInfo = - OperatorInfo( - userFriendlyName = "Bulk Downloader", - operatorDescription = "Download urls in a string column", - operatorGroupName = OperatorGroupConstants.UTILITY_GROUP, - inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) - ) - - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Preconditions.checkArgument(schemas.length == 1) - val inputSchema = schemas(0) - val outputSchemaBuilder = Schema.builder() - // keep the same schema from input - outputSchemaBuilder.add(inputSchema) - if (resultAttribute == null || resultAttribute.isEmpty) { - resultAttribute = urlAttribute + " result" - } - outputSchemaBuilder.add( - new Attribute( - resultAttribute, - AttributeType.STRING - ) - ) - outputSchemaBuilder.build() - } -} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpExec.scala deleted file mode 100644 index b69405d0e82..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpExec.scala +++ /dev/null @@ -1,80 +0,0 @@ -package 
edu.uci.ics.amber.operator.download - -import edu.uci.ics.amber.core.executor.OperatorExecutor -import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} -import edu.uci.ics.amber.core.workflow.WorkflowContext -import edu.uci.ics.amber.operator.source.fetcher.URLFetchUtil.getInputStreamFromURL - -import java.net.URL -import scala.collection.mutable -import scala.concurrent.ExecutionContext.Implicits.global -import scala.concurrent.duration._ -import scala.concurrent.{Await, Future} - -class BulkDownloaderOpExec( - workflowContext: WorkflowContext, - urlAttribute: String -) extends OperatorExecutor { - - private val downloading = new mutable.Queue[Future[TupleLike]]() - - private class DownloadResultIterator(blocking: Boolean) extends Iterator[TupleLike] { - override def hasNext: Boolean = { - if (downloading.isEmpty) { - return false - } - if (blocking) { - Await.result(downloading.head, 5.seconds) - } - downloading.head.isCompleted - } - - override def next(): TupleLike = { - downloading.dequeue().value.get.get - } - } - - override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] = { - - downloading.enqueue(Future { - downloadTuple(tuple) - }) - new DownloadResultIterator(false) - } - - override def onFinish(port: Int): Iterator[TupleLike] = { - new DownloadResultIterator(true) - } - - private def downloadTuple(tuple: Tuple): TupleLike = { - TupleLike(tuple.getFields ++ Seq(downloadUrl(tuple.getField(urlAttribute)))) - } - - private def downloadUrl(url: String): String = { - try { - Await.result( - Future { - val urlObj = new URL(url) - val input = getInputStreamFromURL(urlObj) - input match { - case Some(contentStream) => - if (contentStream.available() > 0) { - val filename = - s"w${workflowContext.workflowId.id}-e${workflowContext.executionId.id}-${urlObj.getHost - .replace(".", "")}.download" - filename - } else { - throw new RuntimeException(s"content is not available for $url") - } - case None => - throw new RuntimeException(s"fetch content failed 
for $url") - } - }, - 5.seconds - ) - } catch { - case throwable: Throwable => s"Failed: ${throwable.getMessage}" - } - } - -} diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpExecSpec.scala deleted file mode 100644 index 6a9a1f2239b..00000000000 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/download/BulkDownloaderOpExecSpec.scala +++ /dev/null @@ -1,44 +0,0 @@ -package edu.uci.ics.amber.operator.download - -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema, Tuple} -import edu.uci.ics.amber.core.workflow.WorkflowContext -import edu.uci.ics.amber.core.workflow.WorkflowContext.{DEFAULT_EXECUTION_ID, DEFAULT_WORKFLOW_ID} -import org.scalatest.BeforeAndAfter -import org.scalatest.flatspec.AnyFlatSpec -class BulkDownloaderOpExecSpec extends AnyFlatSpec with BeforeAndAfter { - val tupleSchema: Schema = Schema - .builder() - .add(new Attribute("url", AttributeType.STRING)) - .build() - - val resultSchema: Schema = Schema - .builder() - .add(new Attribute("url", AttributeType.STRING)) - .add(new Attribute("url result", AttributeType.STRING)) - .build() - - val tuple: () => Tuple = () => - Tuple - .builder(tupleSchema) - .add(new Attribute("url", AttributeType.STRING), "http://www.google.com") - .build() - - val tuple2: () => Tuple = () => - Tuple - .builder(tupleSchema) - .add(new Attribute("url", AttributeType.STRING), "https://www.google.com") - .build() - - var opExec: BulkDownloaderOpExec = _ - before { - opExec = new BulkDownloaderOpExec( - new WorkflowContext(DEFAULT_WORKFLOW_ID, DEFAULT_EXECUTION_ID), - urlAttribute = "url" - ) - } - - it should "open" in { - opExec.open() - } - -} From 41c47bcd4690f8c37fb78cee3d16e48a70a224e9 Mon Sep 17 00:00:00 2001 From: Jiadong Bai <43344272+bobbai00@users.noreply.github.com> Date: Mon, 16 Dec 2024 14:31:06 -0800 Subject: 
[PATCH 03/47] Add JooqCodeGenerator to `dao` and remove `core/util` (#3160) This PR removes the obsolete `core/util` package, which was used to generate jOOQ code. The generation logic has been moved to `dao` as the class `JooqCodeGenerator` --- core/dao/build.sbt | 1 + core/dao/src/main/resources/jooq-conf.xml | 46 ++++++++++ .../ics/texera/dao/JooqCodeGenerator.scala | 54 +++++++++++ core/util/build.sbt | 18 ---- core/util/conf/jooq-conf.xml | 89 ------------------- core/util/project/build.properties | 1 - .../java/edu/uci/ics/util/RunCodegen.java | 47 ---------- 7 files changed, 101 insertions(+), 155 deletions(-) create mode 100644 core/dao/src/main/resources/jooq-conf.xml create mode 100644 core/dao/src/main/scala/edu/uci/ics/texera/dao/JooqCodeGenerator.scala delete mode 100644 core/util/build.sbt delete mode 100644 core/util/conf/jooq-conf.xml delete mode 100644 core/util/project/build.properties delete mode 100644 core/util/src/main/java/edu/uci/ics/util/RunCodegen.java diff --git a/core/dao/build.sbt b/core/dao/build.sbt index 23ec6c6dcb3..526c37be92e 100644 --- a/core/dao/build.sbt +++ b/core/dao/build.sbt @@ -88,4 +88,5 @@ libraryDependencies ++= Seq( libraryDependencies ++= Seq( "mysql" % "mysql-connector-java" % "8.0.33", // MySQL connector + "org.yaml" % "snakeyaml" % "1.30", // for reading storage config yaml file ) \ No newline at end of file diff --git a/core/dao/src/main/resources/jooq-conf.xml b/core/dao/src/main/resources/jooq-conf.xml new file mode 100644 index 00000000000..2935bce5073 --- /dev/null +++ b/core/dao/src/main/resources/jooq-conf.xml @@ -0,0 +1,46 @@ + + + + + false + + true + true + + + org.jooq.codegen.JavaGenerator + + + + org.jooq.meta.mysql.MySQLDatabase + + + texera_db + + + .* + + + (test_.*)|(ignore_.*) + + + + + + edu.uci.ics.texera.dao.jooq.generated + + + dao/src/main/scala + + + diff --git a/core/dao/src/main/scala/edu/uci/ics/texera/dao/JooqCodeGenerator.scala
b/core/dao/src/main/scala/edu/uci/ics/texera/dao/JooqCodeGenerator.scala new file mode 100644 index 00000000000..f0297781434 --- /dev/null +++ b/core/dao/src/main/scala/edu/uci/ics/texera/dao/JooqCodeGenerator.scala @@ -0,0 +1,54 @@ +package edu.uci.ics.texera.dao + +import org.jooq.codegen.GenerationTool +import org.jooq.meta.jaxb.{Configuration, Jdbc} +import org.yaml.snakeyaml.Yaml + +import java.io.InputStream +import java.nio.file.{Files, Path} +import java.util.{Map => JMap} +import scala.jdk.CollectionConverters._ + +object JooqCodeGenerator { + @throws[Exception] + def main(args: Array[String]): Unit = { + // Load jOOQ configuration XML + val jooqXmlPath: Path = + Path.of("dao").resolve("src").resolve("main").resolve("resources").resolve("jooq-conf.xml") + val jooqConfig: Configuration = GenerationTool.load(Files.newInputStream(jooqXmlPath)) + + // Load YAML configuration + val yamlConfPath: Path = Path + .of("workflow-core") + .resolve("src") + .resolve("main") + .resolve("resources") + .resolve("storage-config.yaml") + val yaml = new Yaml + val inputStream: InputStream = Files.newInputStream(yamlConfPath) + + val conf: Map[String, Any] = + yaml.load(inputStream).asInstanceOf[JMap[String, Any]].asScala.toMap + + val jdbcConfig = conf("storage") + .asInstanceOf[JMap[String, Any]] + .asScala("jdbc") + .asInstanceOf[JMap[String, Any]] + .asScala + + // Set JDBC configuration for jOOQ + val jooqJdbcConfig = new Jdbc + jooqJdbcConfig.setDriver("com.mysql.cj.jdbc.Driver") + jooqJdbcConfig.setUrl(jdbcConfig("url").toString) + jooqJdbcConfig.setUsername(jdbcConfig("username").toString) + jooqJdbcConfig.setPassword(jdbcConfig("password").toString) + + jooqConfig.setJdbc(jooqJdbcConfig) + + // Generate the code + GenerationTool.generate(jooqConfig) + + // Close input stream + inputStream.close() + } +} diff --git a/core/util/build.sbt b/core/util/build.sbt deleted file mode 100644 index 58d56c4f89f..00000000000 --- a/core/util/build.sbt +++ /dev/null @@ -1,18 +0,0 
@@ -name := "util" -organization := "edu.uci.ics" -version := "0.1-SNAPSHOT" - -scalaVersion := "2.13.12" - -lazy val util = project - .in(file(".")) - .settings( - // https://mvnrepository.com/artifact/mysql/mysql-connector-java - libraryDependencies += "mysql" % "mysql-connector-java" % "8.0.23", - // https://mvnrepository.com/artifact/com.typesafe/config - libraryDependencies += "com.typesafe" % "config" % "1.4.1", - // https://mvnrepository.com/artifact/org.jooq/jooq - libraryDependencies += "org.jooq" % "jooq" % "3.14.4", - // https://mvnrepository.com/artifact/org.jooq/jooq-codegen - libraryDependencies += "org.jooq" % "jooq-codegen" % "3.12.4" - ) diff --git a/core/util/conf/jooq-conf.xml b/core/util/conf/jooq-conf.xml deleted file mode 100644 index b30e2abcce3..00000000000 --- a/core/util/conf/jooq-conf.xml +++ /dev/null @@ -1,89 +0,0 @@ - - - - false - - true - true - - - org.jooq.codegen.JavaGenerator - - - - org.jooq.meta.mysql.MySQLDatabase - - - texera_db - - - .* - - - (test_.*)|(ignore_.*) - - - - - - edu.uci.ics.texera.web.model.jooq.generated - - - core/amber/src/main/scala - - - - - - false - - true - true - - - org.jooq.codegen.JavaGenerator - - - - org.jooq.meta.mysql.MySQLDatabase - - - texera_db - - - .* - - - (test_.*)|(ignore_.*) - - - - - - edu.uci.ics.texera.dao.jooq.generated - - - core/dao/src/main/scala - - - diff --git a/core/util/project/build.properties b/core/util/project/build.properties deleted file mode 100644 index bb5389da211..00000000000 --- a/core/util/project/build.properties +++ /dev/null @@ -1 +0,0 @@ -sbt.version=1.5.5 \ No newline at end of file diff --git a/core/util/src/main/java/edu/uci/ics/util/RunCodegen.java b/core/util/src/main/java/edu/uci/ics/util/RunCodegen.java deleted file mode 100644 index 3db167a5ba9..00000000000 --- a/core/util/src/main/java/edu/uci/ics/util/RunCodegen.java +++ /dev/null @@ -1,47 +0,0 @@ -package edu.uci.ics.util; - - -import com.typesafe.config.Config; -import 
com.typesafe.config.ConfigFactory; -import org.jooq.codegen.GenerationTool; -import org.jooq.meta.jaxb.Configuration; -import org.jooq.meta.jaxb.Jdbc; - -import java.nio.file.Files; -import java.nio.file.Path; - -/** - * This class is used to generate java classes representing the sql table in Texera database - * These auto generated classes are essential for the connection between backend and database when using JOOQ library. - *

- * Every time the table in the Texera database changes, including creating, dropping and modifying the tables, - * this class must be run to update the corresponding java classes. - *

- * Remember to change the username and password to your owns before you run this class. - *

- * The username, password and connection url is located in texera\core\conf\jdbc.conf - * The configuration file is located in texera\core\conf\jooq-conf.xml - */ -public class RunCodegen { - - public static void main(String[] args) throws Exception { - Path jooqXmlPath = Path.of("core").resolve("util").resolve("conf").resolve("jooq-conf.xml"); - Configuration jooqConfig = GenerationTool.load(Files.newInputStream(jooqXmlPath)); - - Path jdbcConfPath = Path.of("core").resolve("amber").resolve("src").resolve("main").resolve("resources").resolve("application.conf"); - Config jdbcConfig = ConfigFactory.parseFile(jdbcConfPath.toFile()); - - Jdbc jooqJdbcConfig = new Jdbc(); - jooqJdbcConfig.setDriver("com.mysql.cj.jdbc.Driver"); - jooqJdbcConfig.setUrl(jdbcConfig.getString("jdbc.url")); - jooqJdbcConfig.setUsername(jdbcConfig.getString("jdbc.username")); - jooqJdbcConfig.setPassword(jdbcConfig.getString("jdbc.password")); - jooqConfig.setJdbc(jooqJdbcConfig); - - GenerationTool.generate(jooqConfig); - } - -} - - - From fc3170a2ccbb9c2825e6ca751c4758713aadcdf5 Mon Sep 17 00:00:00 2001 From: "Kyuho (Kyu) Oh" <80994706+sixsage@users.noreply.github.com> Date: Mon, 16 Dec 2024 22:16:01 -0800 Subject: [PATCH 04/47] Address Result Panel Getting Sticky Too Easily (#3136) ## Purpose Address #3042 There was an issue where visualization results made it very easy for panels to become "sticky", meaning a panel would keep being dragged even though the user was no longer holding down the mouse. This PR aims to fix this issue. ## Changes Add logic to modify the z-index of visualization results directly. While the panel is being dragged, the visualization result's z-index becomes -1; once dragging ends, the z-index changes back to 0 so that the user can interact with the visualization.
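The toggle described above boils down to a two-state z-index. The helper below is a minimal, hypothetical sketch of that idea, not the actual component code (the real change mutates `style.zIndex` on the `#html-content` element inside the drag handlers):

```typescript
// Hypothetical helper mirroring the behavior described above: while a panel
// is being dragged, the embedded visualization is pushed behind the panel
// (z-index -1) so it cannot capture mouse events; afterwards it is restored.
function visualizationZIndex(isDragging: boolean): number {
  return isDragging ? -1 : 0;
}

const duringDrag = visualizationZIndex(true); // → -1
const afterDrag = visualizationZIndex(false); // → 0
```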
## Demo Before ![chrome_svDxLZYYvl](https://github.com/user-attachments/assets/8299e19d-9c66-4d9c-a40f-3fed8f7823be) After ![R3oNSHxH8J](https://github.com/user-attachments/assets/65069f65-3995-435a-88f6-d7cc5b66ddf1) --------- Co-authored-by: Xinyuan Lin --- .../result-panel/result-panel.component.html | 15 ++++++----- .../result-panel/result-panel.component.ts | 27 +++++++++++++++++-- 2 files changed, 33 insertions(+), 9 deletions(-) diff --git a/core/gui/src/app/workspace/component/result-panel/result-panel.component.html b/core/gui/src/app/workspace/component/result-panel/result-panel.component.html index c707082f0cb..40169e1e069 100644 --- a/core/gui/src/app/workspace/component/result-panel/result-panel.component.html +++ b/core/gui/src/app/workspace/component/result-panel/result-panel.component.html @@ -39,6 +39,7 @@ [style.height.px]="height" (nzResize)="onResize($event)" [cdkDragFreeDragPosition]="dragPosition" + (cdkDragStarted)="handleStartDrag()" (cdkDragEnded)="handleEndDrag($event)">

    -

    - Result Panel{{operatorTitle ? ': ' + operatorTitle : ''}} -

    + *ngIf="width" + cdkDragHandle> +

    Result Panel{{operatorTitle ? ': ' + operatorTitle : ''}}

    @@ -82,7 +80,10 @@

    No results available to display.

    - +
    + +
    diff --git a/core/gui/src/app/workspace/component/result-panel/result-panel.component.ts b/core/gui/src/app/workspace/component/result-panel/result-panel.component.ts index ecc65a994a4..f845030cdf2 100644 --- a/core/gui/src/app/workspace/component/result-panel/result-panel.component.ts +++ b/core/gui/src/app/workspace/component/result-panel/result-panel.component.ts @@ -1,4 +1,14 @@ -import { ChangeDetectorRef, Component, OnInit, Type, HostListener, OnDestroy } from "@angular/core"; +import { + ChangeDetectorRef, + Component, + OnInit, + Type, + HostListener, + OnDestroy, + ViewChild, + ElementRef, + AfterViewInit, +} from "@angular/core"; import { merge } from "rxjs"; import { ExecuteWorkflowService } from "../../service/execute-workflow/execute-workflow.service"; import { WorkflowActionService } from "../../service/workflow-graph/model/workflow-action.service"; @@ -17,7 +27,7 @@ import { NzResizeEvent } from "ng-zorro-antd/resizable"; import { VisualizationFrameContentComponent } from "../visualization-panel-content/visualization-frame-content.component"; import { calculateTotalTranslate3d } from "../../../common/util/panel-dock"; import { isDefined } from "../../../common/util/predicate"; -import { CdkDragEnd } from "@angular/cdk/drag-drop"; +import { CdkDragEnd, CdkDragStart } from "@angular/cdk/drag-drop"; import { PanelService } from "../../service/panel/panel.service"; import { WorkflowCompilingService } from "../../service/compile-workflow/workflow-compiling.service"; import { CompilationState } from "../../types/workflow-compiling.interface"; @@ -36,6 +46,8 @@ export const DEFAULT_HEIGHT = 300; styleUrls: ["./result-panel.component.scss"], }) export class ResultPanelComponent implements OnInit, OnDestroy { + @ViewChild("dynamicComponent") + componentOutlets!: ElementRef; frameComponentConfigs: Map; componentInputs: {} }> = new Map(); protected readonly window = window; id = -1; @@ -316,12 +328,23 @@ export class ResultPanelComponent implements OnInit, 
OnDestroy { return this.returnPosition.x === this.dragPosition.x && this.returnPosition.y === this.dragPosition.y; } + handleStartDrag() { + let visualizationResult = this.componentOutlets.nativeElement.querySelector("#html-content"); + if (visualizationResult !== null) { + visualizationResult.style.zIndex = -1; + } + } + handleEndDrag({ source }: CdkDragEnd) { /** * records the most recent panel location, updating dragPosition when dragging is over */ const { x, y } = source.getFreeDragPosition(); this.dragPosition = { x: x, y: y }; + let visualizationResult = this.componentOutlets.nativeElement.querySelector("#html-content"); + if (visualizationResult !== null) { + visualizationResult.style.zIndex = 0; + } } onResize({ width, height }: NzResizeEvent) { From 8cc509859d7231c206bd1fc89b5d97a9c420b511 Mon Sep 17 00:00:00 2001 From: Jiadong Bai <43344272+bobbai00@users.noreply.github.com> Date: Tue, 17 Dec 2024 00:04:21 -0800 Subject: [PATCH 05/47] Remove redundant jooq codes and their usages in `core/amber` (#3164) As titled, this PR removes the redundant jOOQ code in the package `edu.uci.ics.texera.web.model.jooq.generated` in `core/amber`. All imports that referenced it now point to `edu.uci.ics.texera.dao.jooq.generated` in `core/dao`.
--- .../ics/texera/web/ComputingUnitMaster.scala | 2 +- .../texera/web/ServletAwareConfigurator.scala | 2 +- .../ics/texera/web/auth/GuestAuthFilter.scala | 4 +- .../edu/uci/ics/texera/web/auth/JwtAuth.scala | 2 +- .../uci/ics/texera/web/auth/SessionUser.scala | 4 +- .../texera/web/auth/UserAuthenticator.scala | 4 +- .../texera/web/auth/UserRoleAuthorizer.scala | 2 +- .../model/jooq/generated/DefaultCatalog.java | 51 -- .../web/model/jooq/generated/Indexes.java | 108 ---- .../texera/web/model/jooq/generated/Keys.java | 156 ------ .../web/model/jooq/generated/Tables.java | 110 ---- .../web/model/jooq/generated/TexeraDb.java | 167 ------ .../enums/DatasetUserAccessPrivilege.java | 49 -- .../enums/ProjectUserAccessPrivilege.java | 49 -- .../model/jooq/generated/enums/UserRole.java | 51 -- .../enums/WorkflowUserAccessPrivilege.java | 49 -- .../model/jooq/generated/tables/Dataset.java | 173 ------ .../generated/tables/DatasetUserAccess.java | 157 ------ .../jooq/generated/tables/DatasetVersion.java | 173 ------ .../model/jooq/generated/tables/Project.java | 173 ------ .../generated/tables/ProjectUserAccess.java | 157 ------ .../jooq/generated/tables/PublicProject.java | 147 ----- .../web/model/jooq/generated/tables/User.java | 169 ------ .../jooq/generated/tables/UserConfig.java | 152 ----- .../model/jooq/generated/tables/Workflow.java | 169 ------ .../generated/tables/WorkflowExecutions.java | 202 ------- .../generated/tables/WorkflowOfProject.java | 151 ----- .../jooq/generated/tables/WorkflowOfUser.java | 151 ----- .../tables/WorkflowRuntimeStatistics.java | 198 ------- .../generated/tables/WorkflowUserAccess.java | 157 ------ .../tables/WorkflowUserActivity.java | 135 ----- .../generated/tables/WorkflowUserClones.java | 151 ----- .../generated/tables/WorkflowUserLikes.java | 151 ----- .../generated/tables/WorkflowVersion.java | 163 ------ .../generated/tables/WorkflowViewCount.java | 147 ----- .../generated/tables/daos/DatasetDao.java | 132 ----- 
.../tables/daos/DatasetUserAccessDao.java | 84 --- .../tables/daos/DatasetVersionDao.java | 132 ----- .../generated/tables/daos/ProjectDao.java | 132 ----- .../tables/daos/ProjectUserAccessDao.java | 84 --- .../tables/daos/PublicProjectDao.java | 75 --- .../generated/tables/daos/UserConfigDao.java | 83 --- .../jooq/generated/tables/daos/UserDao.java | 160 ------ .../generated/tables/daos/WorkflowDao.java | 146 ----- .../tables/daos/WorkflowExecutionsDao.java | 202 ------- .../tables/daos/WorkflowOfProjectDao.java | 69 --- .../tables/daos/WorkflowOfUserDao.java | 69 --- .../daos/WorkflowRuntimeStatisticsDao.java | 197 ------- .../tables/daos/WorkflowUserAccessDao.java | 84 --- .../tables/daos/WorkflowUserClonesDao.java | 69 --- .../tables/daos/WorkflowUserLikesDao.java | 69 --- .../tables/daos/WorkflowVersionDao.java | 104 ---- .../tables/daos/WorkflowViewCountDao.java | 75 --- .../generated/tables/interfaces/IDataset.java | 92 --- .../tables/interfaces/IDatasetUserAccess.java | 62 -- .../tables/interfaces/IDatasetVersion.java | 92 --- .../generated/tables/interfaces/IProject.java | 92 --- .../tables/interfaces/IProjectUserAccess.java | 62 -- .../tables/interfaces/IPublicProject.java | 51 -- .../generated/tables/interfaces/IUser.java | 102 ---- .../tables/interfaces/IUserConfig.java | 61 -- .../tables/interfaces/IWorkflow.java | 102 ---- .../interfaces/IWorkflowExecutions.java | 142 ----- .../tables/interfaces/IWorkflowOfProject.java | 51 -- .../tables/interfaces/IWorkflowOfUser.java | 51 -- .../IWorkflowRuntimeStatistics.java | 143 ----- .../interfaces/IWorkflowUserAccess.java | 62 -- .../interfaces/IWorkflowUserActivity.java | 82 --- .../interfaces/IWorkflowUserClones.java | 51 -- .../tables/interfaces/IWorkflowUserLikes.java | 51 -- .../tables/interfaces/IWorkflowVersion.java | 72 --- .../tables/interfaces/IWorkflowViewCount.java | 51 -- .../jooq/generated/tables/pojos/Dataset.java | 150 ----- .../tables/pojos/DatasetUserAccess.java | 101 ---- 
.../tables/pojos/DatasetVersion.java | 150 ----- .../jooq/generated/tables/pojos/Project.java | 150 ----- .../tables/pojos/ProjectUserAccess.java | 101 ---- .../generated/tables/pojos/PublicProject.java | 84 --- .../jooq/generated/tables/pojos/User.java | 165 ------ .../generated/tables/pojos/UserConfig.java | 100 ---- .../jooq/generated/tables/pojos/Workflow.java | 166 ------ .../tables/pojos/WorkflowExecutions.java | 230 -------- .../tables/pojos/WorkflowOfProject.java | 84 --- .../tables/pojos/WorkflowOfUser.java | 84 --- .../pojos/WorkflowRuntimeStatistics.java | 231 -------- .../tables/pojos/WorkflowUserAccess.java | 101 ---- .../tables/pojos/WorkflowUserActivity.java | 134 ----- .../tables/pojos/WorkflowUserClones.java | 84 --- .../tables/pojos/WorkflowUserLikes.java | 84 --- .../tables/pojos/WorkflowVersion.java | 118 ---- .../tables/pojos/WorkflowViewCount.java | 84 --- .../tables/records/DatasetRecord.java | 327 ----------- .../records/DatasetUserAccessRecord.java | 206 ------- .../tables/records/DatasetVersionRecord.java | 327 ----------- .../tables/records/ProjectRecord.java | 327 ----------- .../records/ProjectUserAccessRecord.java | 206 ------- .../tables/records/PublicProjectRecord.java | 165 ------ .../tables/records/UserConfigRecord.java | 205 ------- .../generated/tables/records/UserRecord.java | 366 ------------ .../records/WorkflowExecutionsRecord.java | 527 ----------------- .../records/WorkflowOfProjectRecord.java | 164 ------ .../tables/records/WorkflowOfUserRecord.java | 164 ------ .../tables/records/WorkflowRecord.java | 367 ------------ .../WorkflowRuntimeStatisticsRecord.java | 528 ------------------ .../records/WorkflowUserAccessRecord.java | 206 ------- .../records/WorkflowUserActivityRecord.java | 277 --------- .../records/WorkflowUserClonesRecord.java | 164 ------ .../records/WorkflowUserLikesRecord.java | 164 ------ .../tables/records/WorkflowVersionRecord.java | 247 -------- .../records/WorkflowViewCountRecord.java | 165 ------ 
.../web/resource/CollaborationResource.scala | 2 +- .../web/resource/UserConfigResource.scala | 6 +- .../resource/WorkflowWebsocketResource.scala | 2 +- .../web/resource/auth/AuthResource.scala | 8 +- .../resource/auth/GoogleAuthResource.scala | 6 +- .../dashboard/DashboardResource.scala | 4 +- .../dashboard/DatasetSearchQueryBuilder.scala | 8 +- .../dashboard/ProjectSearchQueryBuilder.scala | 4 +- .../dashboard/UnifiedResourceSchema.scala | 2 +- .../WorkflowSearchQueryBuilder.scala | 4 +- .../execution/AdminExecutionResource.scala | 2 +- .../admin/user/AdminUserResource.scala | 6 +- .../hub/workflow/HubWorkflowResource.scala | 6 +- .../user/dataset/DatasetAccessResource.scala | 16 +- .../user/dataset/DatasetResource.scala | 14 +- .../utils/DatasetStatisticsUtils.scala | 2 +- .../user/project/ProjectAccessResource.scala | 12 +- .../user/project/ProjectResource.scala | 8 +- .../user/project/PublicProjectResource.scala | 11 +- .../user/quota/UserQuotaResource.scala | 2 +- .../workflow/WorkflowAccessResource.scala | 8 +- .../workflow/WorkflowExecutionsResource.scala | 6 +- .../user/workflow/WorkflowResource.scala | 8 +- .../workflow/WorkflowVersionResource.scala | 6 +- .../web/service/ExecutionStatsService.scala | 2 +- .../ExecutionsMetadataPersistService.scala | 4 +- .../web/service/ResultExportService.scala | 2 +- .../texera/web/service/WorkflowService.scala | 2 +- .../dashboard/file/WorkflowResourceSpec.scala | 8 +- 139 files changed, 90 insertions(+), 14908 deletions(-) delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/DefaultCatalog.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Indexes.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Keys.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Tables.java delete mode 100644 
core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/TexeraDb.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/DatasetUserAccessPrivilege.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/ProjectUserAccessPrivilege.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/UserRole.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/WorkflowUserAccessPrivilege.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Dataset.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/DatasetUserAccess.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/DatasetVersion.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Project.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/ProjectUserAccess.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/PublicProject.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/User.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/UserConfig.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Workflow.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowExecutions.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowOfProject.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowOfUser.java delete mode 100644 
core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowRuntimeStatistics.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserAccess.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserActivity.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserClones.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserLikes.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowVersion.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowViewCount.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetUserAccessDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetVersionDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/ProjectDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/ProjectUserAccessDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/PublicProjectDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/UserConfigDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/UserDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowDao.java delete mode 100644 
core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowExecutionsDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowOfProjectDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowOfUserDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowRuntimeStatisticsDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserAccessDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserClonesDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserLikesDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowVersionDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowViewCountDao.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDataset.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDatasetUserAccess.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDatasetVersion.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IProject.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IProjectUserAccess.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IPublicProject.java delete mode 100644 
core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IUser.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IUserConfig.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflow.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowExecutions.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowOfProject.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowOfUser.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowRuntimeStatistics.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserAccess.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserActivity.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserClones.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserLikes.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowVersion.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowViewCount.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Dataset.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/DatasetUserAccess.java delete mode 100644 
core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/DatasetVersion.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Project.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/ProjectUserAccess.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/PublicProject.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/User.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/UserConfig.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Workflow.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowExecutions.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowOfProject.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowOfUser.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowRuntimeStatistics.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserAccess.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserActivity.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserClones.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserLikes.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowVersion.java delete mode 100644 
core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowViewCount.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetUserAccessRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetVersionRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/ProjectRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/ProjectUserAccessRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/PublicProjectRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/UserConfigRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/UserRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowExecutionsRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowOfProjectRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowOfUserRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowRuntimeStatisticsRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserAccessRecord.java delete mode 100644 
core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserActivityRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserClonesRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserLikesRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowVersionRecord.java delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowViewCountRecord.java diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/ComputingUnitMaster.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/ComputingUnitMaster.scala index 27bd59ddf8d..eba2c81df24 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/ComputingUnitMaster.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/ComputingUnitMaster.scala @@ -17,7 +17,7 @@ import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage import edu.uci.ics.amber.engine.common.{AmberConfig, AmberRuntime, Utils} import edu.uci.ics.amber.virtualidentity.ExecutionIdentity import edu.uci.ics.texera.web.auth.JwtAuth.setupJwtAuth -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowExecutions +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.WorkflowExecutions import edu.uci.ics.texera.web.resource.WorkflowWebsocketResource import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowExecutionsResource import edu.uci.ics.texera.web.service.ExecutionsMetadataPersistService diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/ServletAwareConfigurator.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/ServletAwareConfigurator.scala index 28c89fa123c..ce0e65913bc 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/ServletAwareConfigurator.scala +++ 
b/core/amber/src/main/scala/edu/uci/ics/texera/web/ServletAwareConfigurator.scala @@ -2,7 +2,7 @@ package edu.uci.ics.texera.web import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.texera.web.auth.JwtAuth.jwtConsumer -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import org.apache.http.client.utils.URLEncodedUtils import org.jooq.types.UInteger diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/GuestAuthFilter.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/GuestAuthFilter.scala index 474406bf9fe..318a12c8f2d 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/GuestAuthFilter.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/GuestAuthFilter.scala @@ -1,8 +1,8 @@ package edu.uci.ics.texera.web.auth import edu.uci.ics.texera.web.auth.GuestAuthFilter.GUEST -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.enums.UserRole +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import io.dropwizard.auth.AuthFilter import java.io.IOException diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/JwtAuth.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/JwtAuth.scala index 0cd5c5978a8..de67708c794 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/JwtAuth.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/JwtAuth.scala @@ -3,7 +3,7 @@ package edu.uci.ics.texera.web.auth import com.github.toastshaman.dropwizard.auth.jwt.JwtAuthFilter import com.typesafe.config.Config import edu.uci.ics.amber.engine.common.AmberConfig -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import io.dropwizard.auth.AuthDynamicFeature import 
io.dropwizard.setup.Environment import org.jose4j.jws.AlgorithmIdentifiers.HMAC_SHA256 diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/SessionUser.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/SessionUser.scala index 62445306b8c..baee51b24fb 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/SessionUser.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/SessionUser.scala @@ -1,7 +1,7 @@ package edu.uci.ics.texera.web.auth -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.enums.UserRole +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import org.jooq.types.UInteger import java.security.Principal diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/UserAuthenticator.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/UserAuthenticator.scala index 883edac54da..2a4947dc2ce 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/UserAuthenticator.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/UserAuthenticator.scala @@ -1,8 +1,8 @@ package edu.uci.ics.texera.web.auth import com.typesafe.scalalogging.LazyLogging -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.enums.UserRole +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import io.dropwizard.auth.Authenticator import org.jooq.types.UInteger import org.jose4j.jwt.consumer.JwtContext diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/UserRoleAuthorizer.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/UserRoleAuthorizer.scala index bd2583bcb1c..67ceade63ab 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/UserRoleAuthorizer.scala +++ 
b/core/amber/src/main/scala/edu/uci/ics/texera/web/auth/UserRoleAuthorizer.scala @@ -1,6 +1,6 @@ package edu.uci.ics.texera.web.auth -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole +import edu.uci.ics.texera.dao.jooq.generated.enums.UserRole import io.dropwizard.auth.Authorizer object UserRoleAuthorizer extends Authorizer[SessionUser] { diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/DefaultCatalog.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/DefaultCatalog.java deleted file mode 100644 index 4d86516a86d..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/DefaultCatalog.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated; - - -import org.jooq.Schema; -import org.jooq.impl.CatalogImpl; - -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class DefaultCatalog extends CatalogImpl { - - private static final long serialVersionUID = -625238328; - - /** - * The reference instance of - */ - public static final DefaultCatalog DEFAULT_CATALOG = new DefaultCatalog(); - - /** - * The schema texera_db. 
- */ - public final TexeraDb TEXERA_DB = edu.uci.ics.texera.web.model.jooq.generated.TexeraDb.TEXERA_DB; - - /** - * No further instances allowed - */ - private DefaultCatalog() { - super(""); - } - - @Override - public final List getSchemas() { - List result = new ArrayList(); - result.addAll(getSchemas0()); - return result; - } - - private final List getSchemas0() { - return Arrays.asList( - TexeraDb.TEXERA_DB); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Indexes.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Indexes.java deleted file mode 100644 index 674a105bb5c..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Indexes.java +++ /dev/null @@ -1,108 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.*; -import org.jooq.Index; -import org.jooq.OrderField; -import org.jooq.impl.Internal; - - -/** - * A class modelling indexes of tables of the texera_db schema. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class Indexes { - - // ------------------------------------------------------------------------- - // INDEX definitions - // ------------------------------------------------------------------------- - - public static final Index DATASET_IDX_DATASET_NAME_DESCRIPTION = Indexes0.DATASET_IDX_DATASET_NAME_DESCRIPTION; - public static final Index DATASET_OWNER_UID = Indexes0.DATASET_OWNER_UID; - public static final Index DATASET_PRIMARY = Indexes0.DATASET_PRIMARY; - public static final Index DATASET_USER_ACCESS_PRIMARY = Indexes0.DATASET_USER_ACCESS_PRIMARY; - public static final Index DATASET_USER_ACCESS_UID = Indexes0.DATASET_USER_ACCESS_UID; - public static final Index DATASET_VERSION_DID = Indexes0.DATASET_VERSION_DID; - public static final Index DATASET_VERSION_IDX_DATASET_VERSION_NAME = Indexes0.DATASET_VERSION_IDX_DATASET_VERSION_NAME; - public static final Index DATASET_VERSION_PRIMARY = Indexes0.DATASET_VERSION_PRIMARY; - public static final Index PROJECT_IDX_USER_PROJECT_NAME_DESCRIPTION = Indexes0.PROJECT_IDX_USER_PROJECT_NAME_DESCRIPTION; - public static final Index PROJECT_OWNER_ID = Indexes0.PROJECT_OWNER_ID; - public static final Index PROJECT_PRIMARY = Indexes0.PROJECT_PRIMARY; - public static final Index PROJECT_USER_ACCESS_PID = Indexes0.PROJECT_USER_ACCESS_PID; - public static final Index PROJECT_USER_ACCESS_PRIMARY = Indexes0.PROJECT_USER_ACCESS_PRIMARY; - public static final Index PUBLIC_PROJECT_PRIMARY = Indexes0.PUBLIC_PROJECT_PRIMARY; - public static final Index USER_EMAIL = Indexes0.USER_EMAIL; - public static final Index USER_GOOGLE_ID = Indexes0.USER_GOOGLE_ID; - public static final Index USER_IDX_USER_NAME = Indexes0.USER_IDX_USER_NAME; - public static final Index USER_PRIMARY = Indexes0.USER_PRIMARY; - public static final Index USER_CONFIG_PRIMARY = Indexes0.USER_CONFIG_PRIMARY; - public static final Index WORKFLOW_IDX_WORKFLOW_NAME_DESCRIPTION_CONTENT = 
Indexes0.WORKFLOW_IDX_WORKFLOW_NAME_DESCRIPTION_CONTENT; - public static final Index WORKFLOW_PRIMARY = Indexes0.WORKFLOW_PRIMARY; - public static final Index WORKFLOW_EXECUTIONS_PRIMARY = Indexes0.WORKFLOW_EXECUTIONS_PRIMARY; - public static final Index WORKFLOW_EXECUTIONS_UID = Indexes0.WORKFLOW_EXECUTIONS_UID; - public static final Index WORKFLOW_EXECUTIONS_VID = Indexes0.WORKFLOW_EXECUTIONS_VID; - public static final Index WORKFLOW_OF_PROJECT_PID = Indexes0.WORKFLOW_OF_PROJECT_PID; - public static final Index WORKFLOW_OF_PROJECT_PRIMARY = Indexes0.WORKFLOW_OF_PROJECT_PRIMARY; - public static final Index WORKFLOW_OF_USER_PRIMARY = Indexes0.WORKFLOW_OF_USER_PRIMARY; - public static final Index WORKFLOW_OF_USER_WID = Indexes0.WORKFLOW_OF_USER_WID; - public static final Index WORKFLOW_RUNTIME_STATISTICS_EXECUTION_ID = Indexes0.WORKFLOW_RUNTIME_STATISTICS_EXECUTION_ID; - public static final Index WORKFLOW_RUNTIME_STATISTICS_PRIMARY = Indexes0.WORKFLOW_RUNTIME_STATISTICS_PRIMARY; - public static final Index WORKFLOW_USER_ACCESS_PRIMARY = Indexes0.WORKFLOW_USER_ACCESS_PRIMARY; - public static final Index WORKFLOW_USER_ACCESS_WID = Indexes0.WORKFLOW_USER_ACCESS_WID; - public static final Index WORKFLOW_USER_CLONES_PRIMARY = Indexes0.WORKFLOW_USER_CLONES_PRIMARY; - public static final Index WORKFLOW_USER_CLONES_WID = Indexes0.WORKFLOW_USER_CLONES_WID; - public static final Index WORKFLOW_USER_LIKES_PRIMARY = Indexes0.WORKFLOW_USER_LIKES_PRIMARY; - public static final Index WORKFLOW_USER_LIKES_WID = Indexes0.WORKFLOW_USER_LIKES_WID; - public static final Index WORKFLOW_VERSION_PRIMARY = Indexes0.WORKFLOW_VERSION_PRIMARY; - public static final Index WORKFLOW_VERSION_WID = Indexes0.WORKFLOW_VERSION_WID; - public static final Index WORKFLOW_VIEW_COUNT_PRIMARY = Indexes0.WORKFLOW_VIEW_COUNT_PRIMARY; - - // ------------------------------------------------------------------------- - // [#1459] distribute members to avoid static initialisers > 64kb - // 
-------------------------------------------------------------------------
-
-    private static class Indexes0 {
-        public static Index DATASET_IDX_DATASET_NAME_DESCRIPTION = Internal.createIndex("idx_dataset_name_description", Dataset.DATASET, new OrderField[]{Dataset.DATASET.NAME, Dataset.DATASET.DESCRIPTION}, false);
-        public static Index DATASET_OWNER_UID = Internal.createIndex("owner_uid", Dataset.DATASET, new OrderField[]{Dataset.DATASET.OWNER_UID}, false);
-        public static Index DATASET_PRIMARY = Internal.createIndex("PRIMARY", Dataset.DATASET, new OrderField[]{Dataset.DATASET.DID}, true);
-        public static Index DATASET_USER_ACCESS_PRIMARY = Internal.createIndex("PRIMARY", DatasetUserAccess.DATASET_USER_ACCESS, new OrderField[]{DatasetUserAccess.DATASET_USER_ACCESS.DID, DatasetUserAccess.DATASET_USER_ACCESS.UID}, true);
-        public static Index DATASET_USER_ACCESS_UID = Internal.createIndex("uid", DatasetUserAccess.DATASET_USER_ACCESS, new OrderField[]{DatasetUserAccess.DATASET_USER_ACCESS.UID}, false);
-        public static Index DATASET_VERSION_DID = Internal.createIndex("did", DatasetVersion.DATASET_VERSION, new OrderField[]{DatasetVersion.DATASET_VERSION.DID}, false);
-        public static Index DATASET_VERSION_IDX_DATASET_VERSION_NAME = Internal.createIndex("idx_dataset_version_name", DatasetVersion.DATASET_VERSION, new OrderField[]{DatasetVersion.DATASET_VERSION.NAME}, false);
-        public static Index DATASET_VERSION_PRIMARY = Internal.createIndex("PRIMARY", DatasetVersion.DATASET_VERSION, new OrderField[]{DatasetVersion.DATASET_VERSION.DVID}, true);
-        public static Index PROJECT_IDX_USER_PROJECT_NAME_DESCRIPTION = Internal.createIndex("idx_user_project_name_description", Project.PROJECT, new OrderField[]{Project.PROJECT.NAME, Project.PROJECT.DESCRIPTION}, false);
-        public static Index PROJECT_OWNER_ID = Internal.createIndex("owner_id", Project.PROJECT, new OrderField[]{Project.PROJECT.OWNER_ID, Project.PROJECT.NAME}, true);
-        public static Index PROJECT_PRIMARY = Internal.createIndex("PRIMARY", Project.PROJECT, new OrderField[]{Project.PROJECT.PID}, true);
-        public static Index PROJECT_USER_ACCESS_PID = Internal.createIndex("pid", ProjectUserAccess.PROJECT_USER_ACCESS, new OrderField[]{ProjectUserAccess.PROJECT_USER_ACCESS.PID}, false);
-        public static Index PROJECT_USER_ACCESS_PRIMARY = Internal.createIndex("PRIMARY", ProjectUserAccess.PROJECT_USER_ACCESS, new OrderField[]{ProjectUserAccess.PROJECT_USER_ACCESS.UID, ProjectUserAccess.PROJECT_USER_ACCESS.PID}, true);
-        public static Index PUBLIC_PROJECT_PRIMARY = Internal.createIndex("PRIMARY", PublicProject.PUBLIC_PROJECT, new OrderField[]{PublicProject.PUBLIC_PROJECT.PID}, true);
-        public static Index USER_EMAIL = Internal.createIndex("email", User.USER, new OrderField[]{User.USER.EMAIL}, true);
-        public static Index USER_GOOGLE_ID = Internal.createIndex("google_id", User.USER, new OrderField[]{User.USER.GOOGLE_ID}, true);
-        public static Index USER_IDX_USER_NAME = Internal.createIndex("idx_user_name", User.USER, new OrderField[]{User.USER.NAME}, false);
-        public static Index USER_PRIMARY = Internal.createIndex("PRIMARY", User.USER, new OrderField[]{User.USER.UID}, true);
-        public static Index USER_CONFIG_PRIMARY = Internal.createIndex("PRIMARY", UserConfig.USER_CONFIG, new OrderField[]{UserConfig.USER_CONFIG.UID, UserConfig.USER_CONFIG.KEY}, true);
-        public static Index WORKFLOW_IDX_WORKFLOW_NAME_DESCRIPTION_CONTENT = Internal.createIndex("idx_workflow_name_description_content", Workflow.WORKFLOW, new OrderField[]{Workflow.WORKFLOW.NAME, Workflow.WORKFLOW.DESCRIPTION, Workflow.WORKFLOW.CONTENT}, false);
-        public static Index WORKFLOW_PRIMARY = Internal.createIndex("PRIMARY", Workflow.WORKFLOW, new OrderField[]{Workflow.WORKFLOW.WID}, true);
-        public static Index WORKFLOW_EXECUTIONS_PRIMARY = Internal.createIndex("PRIMARY", WorkflowExecutions.WORKFLOW_EXECUTIONS, new OrderField[]{WorkflowExecutions.WORKFLOW_EXECUTIONS.EID}, true);
-        public static Index WORKFLOW_EXECUTIONS_UID = Internal.createIndex("uid", WorkflowExecutions.WORKFLOW_EXECUTIONS, new OrderField[]{WorkflowExecutions.WORKFLOW_EXECUTIONS.UID}, false);
-        public static Index WORKFLOW_EXECUTIONS_VID = Internal.createIndex("vid", WorkflowExecutions.WORKFLOW_EXECUTIONS, new OrderField[]{WorkflowExecutions.WORKFLOW_EXECUTIONS.VID}, false);
-        public static Index WORKFLOW_OF_PROJECT_PID = Internal.createIndex("pid", WorkflowOfProject.WORKFLOW_OF_PROJECT, new OrderField[]{WorkflowOfProject.WORKFLOW_OF_PROJECT.PID}, false);
-        public static Index WORKFLOW_OF_PROJECT_PRIMARY = Internal.createIndex("PRIMARY", WorkflowOfProject.WORKFLOW_OF_PROJECT, new OrderField[]{WorkflowOfProject.WORKFLOW_OF_PROJECT.WID, WorkflowOfProject.WORKFLOW_OF_PROJECT.PID}, true);
-        public static Index WORKFLOW_OF_USER_PRIMARY = Internal.createIndex("PRIMARY", WorkflowOfUser.WORKFLOW_OF_USER, new OrderField[]{WorkflowOfUser.WORKFLOW_OF_USER.UID, WorkflowOfUser.WORKFLOW_OF_USER.WID}, true);
-        public static Index WORKFLOW_OF_USER_WID = Internal.createIndex("wid", WorkflowOfUser.WORKFLOW_OF_USER, new OrderField[]{WorkflowOfUser.WORKFLOW_OF_USER.WID}, false);
-        public static Index WORKFLOW_RUNTIME_STATISTICS_EXECUTION_ID = Internal.createIndex("execution_id", WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS, new OrderField[]{WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.EXECUTION_ID}, false);
-        public static Index WORKFLOW_RUNTIME_STATISTICS_PRIMARY = Internal.createIndex("PRIMARY", WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS, new OrderField[]{WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.WORKFLOW_ID, WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.EXECUTION_ID, WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.OPERATOR_ID, WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.TIME}, true);
-        public static Index WORKFLOW_USER_ACCESS_PRIMARY = Internal.createIndex("PRIMARY", WorkflowUserAccess.WORKFLOW_USER_ACCESS, new OrderField[]{WorkflowUserAccess.WORKFLOW_USER_ACCESS.UID, WorkflowUserAccess.WORKFLOW_USER_ACCESS.WID}, true);
-        public static Index WORKFLOW_USER_ACCESS_WID = Internal.createIndex("wid", WorkflowUserAccess.WORKFLOW_USER_ACCESS, new OrderField[]{WorkflowUserAccess.WORKFLOW_USER_ACCESS.WID}, false);
-        public static Index WORKFLOW_USER_CLONES_PRIMARY = Internal.createIndex("PRIMARY", WorkflowUserClones.WORKFLOW_USER_CLONES, new OrderField[]{WorkflowUserClones.WORKFLOW_USER_CLONES.UID, WorkflowUserClones.WORKFLOW_USER_CLONES.WID}, true);
-        public static Index WORKFLOW_USER_CLONES_WID = Internal.createIndex("wid", WorkflowUserClones.WORKFLOW_USER_CLONES, new OrderField[]{WorkflowUserClones.WORKFLOW_USER_CLONES.WID}, false);
-        public static Index WORKFLOW_USER_LIKES_PRIMARY = Internal.createIndex("PRIMARY", WorkflowUserLikes.WORKFLOW_USER_LIKES, new OrderField[]{WorkflowUserLikes.WORKFLOW_USER_LIKES.UID, WorkflowUserLikes.WORKFLOW_USER_LIKES.WID}, true);
-        public static Index WORKFLOW_USER_LIKES_WID = Internal.createIndex("wid", WorkflowUserLikes.WORKFLOW_USER_LIKES, new OrderField[]{WorkflowUserLikes.WORKFLOW_USER_LIKES.WID}, false);
-        public static Index WORKFLOW_VERSION_PRIMARY = Internal.createIndex("PRIMARY", WorkflowVersion.WORKFLOW_VERSION, new OrderField[]{WorkflowVersion.WORKFLOW_VERSION.VID}, true);
-        public static Index WORKFLOW_VERSION_WID = Internal.createIndex("wid", WorkflowVersion.WORKFLOW_VERSION, new OrderField[]{WorkflowVersion.WORKFLOW_VERSION.WID}, false);
-        public static Index WORKFLOW_VIEW_COUNT_PRIMARY = Internal.createIndex("PRIMARY", WorkflowViewCount.WORKFLOW_VIEW_COUNT, new OrderField[]{WorkflowViewCount.WORKFLOW_VIEW_COUNT.WID}, true);
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Keys.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Keys.java
deleted file mode 100644
index 3e60490c110..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Keys.java
+++ /dev/null
@@ -1,156 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.tables.*;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.*;
-import org.jooq.ForeignKey;
-import org.jooq.Identity;
-import org.jooq.UniqueKey;
-import org.jooq.impl.Internal;
-import org.jooq.types.UInteger;
-
-
-/**
- * A class modelling foreign key relationships and constraints of tables of
- * the texera_db schema.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class Keys {
-
-    // -------------------------------------------------------------------------
-    // IDENTITY definitions
-    // -------------------------------------------------------------------------
-
-    public static final Identity<DatasetRecord, UInteger> IDENTITY_DATASET = Identities0.IDENTITY_DATASET;
-    public static final Identity<DatasetVersionRecord, UInteger> IDENTITY_DATASET_VERSION = Identities0.IDENTITY_DATASET_VERSION;
-    public static final Identity<ProjectRecord, UInteger> IDENTITY_PROJECT = Identities0.IDENTITY_PROJECT;
-    public static final Identity<UserRecord, UInteger> IDENTITY_USER = Identities0.IDENTITY_USER;
-    public static final Identity<WorkflowRecord, UInteger> IDENTITY_WORKFLOW = Identities0.IDENTITY_WORKFLOW;
-    public static final Identity<WorkflowExecutionsRecord, UInteger> IDENTITY_WORKFLOW_EXECUTIONS = Identities0.IDENTITY_WORKFLOW_EXECUTIONS;
-    public static final Identity<WorkflowVersionRecord, UInteger> IDENTITY_WORKFLOW_VERSION = Identities0.IDENTITY_WORKFLOW_VERSION;
-
-    // -------------------------------------------------------------------------
-    // UNIQUE and PRIMARY KEY definitions
-    // -------------------------------------------------------------------------
-
-    public static final UniqueKey<DatasetRecord> KEY_DATASET_PRIMARY = UniqueKeys0.KEY_DATASET_PRIMARY;
-    public static final UniqueKey<DatasetUserAccessRecord> KEY_DATASET_USER_ACCESS_PRIMARY = UniqueKeys0.KEY_DATASET_USER_ACCESS_PRIMARY;
-    public static final UniqueKey<DatasetVersionRecord> KEY_DATASET_VERSION_PRIMARY = UniqueKeys0.KEY_DATASET_VERSION_PRIMARY;
-    public static final UniqueKey<ProjectRecord> KEY_PROJECT_PRIMARY = UniqueKeys0.KEY_PROJECT_PRIMARY;
-    public static final UniqueKey<ProjectRecord> KEY_PROJECT_OWNER_ID = UniqueKeys0.KEY_PROJECT_OWNER_ID;
-    public static final UniqueKey<ProjectUserAccessRecord> KEY_PROJECT_USER_ACCESS_PRIMARY = UniqueKeys0.KEY_PROJECT_USER_ACCESS_PRIMARY;
-    public static final UniqueKey<PublicProjectRecord> KEY_PUBLIC_PROJECT_PRIMARY = UniqueKeys0.KEY_PUBLIC_PROJECT_PRIMARY;
-    public static final UniqueKey<UserRecord> KEY_USER_PRIMARY = UniqueKeys0.KEY_USER_PRIMARY;
-    public static final UniqueKey<UserRecord> KEY_USER_EMAIL = UniqueKeys0.KEY_USER_EMAIL;
-    public static final UniqueKey<UserRecord> KEY_USER_GOOGLE_ID = UniqueKeys0.KEY_USER_GOOGLE_ID;
-    public static final UniqueKey<UserConfigRecord> KEY_USER_CONFIG_PRIMARY = UniqueKeys0.KEY_USER_CONFIG_PRIMARY;
-    public static final UniqueKey<WorkflowRecord> KEY_WORKFLOW_PRIMARY = UniqueKeys0.KEY_WORKFLOW_PRIMARY;
-    public static final UniqueKey<WorkflowExecutionsRecord> KEY_WORKFLOW_EXECUTIONS_PRIMARY = UniqueKeys0.KEY_WORKFLOW_EXECUTIONS_PRIMARY;
-    public static final UniqueKey<WorkflowOfProjectRecord> KEY_WORKFLOW_OF_PROJECT_PRIMARY = UniqueKeys0.KEY_WORKFLOW_OF_PROJECT_PRIMARY;
-    public static final UniqueKey<WorkflowOfUserRecord> KEY_WORKFLOW_OF_USER_PRIMARY = UniqueKeys0.KEY_WORKFLOW_OF_USER_PRIMARY;
-    public static final UniqueKey<WorkflowRuntimeStatisticsRecord> KEY_WORKFLOW_RUNTIME_STATISTICS_PRIMARY = UniqueKeys0.KEY_WORKFLOW_RUNTIME_STATISTICS_PRIMARY;
-    public static final UniqueKey<WorkflowUserAccessRecord> KEY_WORKFLOW_USER_ACCESS_PRIMARY = UniqueKeys0.KEY_WORKFLOW_USER_ACCESS_PRIMARY;
-    public static final UniqueKey<WorkflowUserClonesRecord> KEY_WORKFLOW_USER_CLONES_PRIMARY = UniqueKeys0.KEY_WORKFLOW_USER_CLONES_PRIMARY;
-    public static final UniqueKey<WorkflowUserLikesRecord> KEY_WORKFLOW_USER_LIKES_PRIMARY = UniqueKeys0.KEY_WORKFLOW_USER_LIKES_PRIMARY;
-    public static final UniqueKey<WorkflowVersionRecord> KEY_WORKFLOW_VERSION_PRIMARY = UniqueKeys0.KEY_WORKFLOW_VERSION_PRIMARY;
-    public static final UniqueKey<WorkflowViewCountRecord> KEY_WORKFLOW_VIEW_COUNT_PRIMARY = UniqueKeys0.KEY_WORKFLOW_VIEW_COUNT_PRIMARY;
-
-    // -------------------------------------------------------------------------
-    // FOREIGN KEY definitions
-    // -------------------------------------------------------------------------
-
-    public static final ForeignKey<DatasetRecord, UserRecord> DATASET_IBFK_1 = ForeignKeys0.DATASET_IBFK_1;
-    public static final ForeignKey<DatasetUserAccessRecord, DatasetRecord> DATASET_USER_ACCESS_IBFK_1 = ForeignKeys0.DATASET_USER_ACCESS_IBFK_1;
-    public static final ForeignKey<DatasetUserAccessRecord, UserRecord> DATASET_USER_ACCESS_IBFK_2 = ForeignKeys0.DATASET_USER_ACCESS_IBFK_2;
-    public static final ForeignKey<DatasetVersionRecord, DatasetRecord> DATASET_VERSION_IBFK_1 = ForeignKeys0.DATASET_VERSION_IBFK_1;
-    public static final ForeignKey<ProjectRecord, UserRecord> PROJECT_IBFK_1 = ForeignKeys0.PROJECT_IBFK_1;
-    public static final ForeignKey<ProjectUserAccessRecord, UserRecord> PROJECT_USER_ACCESS_IBFK_1 = ForeignKeys0.PROJECT_USER_ACCESS_IBFK_1;
-    public static final ForeignKey<ProjectUserAccessRecord, ProjectRecord> PROJECT_USER_ACCESS_IBFK_2 = ForeignKeys0.PROJECT_USER_ACCESS_IBFK_2;
-    public static final ForeignKey<PublicProjectRecord, ProjectRecord> PUBLIC_PROJECT_IBFK_1 = ForeignKeys0.PUBLIC_PROJECT_IBFK_1;
-    public static final ForeignKey<UserConfigRecord, UserRecord> USER_CONFIG_IBFK_1 = ForeignKeys0.USER_CONFIG_IBFK_1;
-    public static final ForeignKey<WorkflowExecutionsRecord, WorkflowVersionRecord> WORKFLOW_EXECUTIONS_IBFK_1 = ForeignKeys0.WORKFLOW_EXECUTIONS_IBFK_1;
-    public static final ForeignKey<WorkflowExecutionsRecord, UserRecord> WORKFLOW_EXECUTIONS_IBFK_2 = ForeignKeys0.WORKFLOW_EXECUTIONS_IBFK_2;
-    public static final ForeignKey<WorkflowOfProjectRecord, WorkflowRecord> WORKFLOW_OF_PROJECT_IBFK_1 = ForeignKeys0.WORKFLOW_OF_PROJECT_IBFK_1;
-    public static final ForeignKey<WorkflowOfProjectRecord, ProjectRecord> WORKFLOW_OF_PROJECT_IBFK_2 = ForeignKeys0.WORKFLOW_OF_PROJECT_IBFK_2;
-    public static final ForeignKey<WorkflowOfUserRecord, UserRecord> WORKFLOW_OF_USER_IBFK_1 = ForeignKeys0.WORKFLOW_OF_USER_IBFK_1;
-    public static final ForeignKey<WorkflowOfUserRecord, WorkflowRecord> WORKFLOW_OF_USER_IBFK_2 = ForeignKeys0.WORKFLOW_OF_USER_IBFK_2;
-    public static final ForeignKey<WorkflowRuntimeStatisticsRecord, WorkflowRecord> WORKFLOW_RUNTIME_STATISTICS_IBFK_1 = ForeignKeys0.WORKFLOW_RUNTIME_STATISTICS_IBFK_1;
-    public static final ForeignKey<WorkflowRuntimeStatisticsRecord, WorkflowExecutionsRecord> WORKFLOW_RUNTIME_STATISTICS_IBFK_2 = ForeignKeys0.WORKFLOW_RUNTIME_STATISTICS_IBFK_2;
-    public static final ForeignKey<WorkflowUserAccessRecord, UserRecord> WORKFLOW_USER_ACCESS_IBFK_1 = ForeignKeys0.WORKFLOW_USER_ACCESS_IBFK_1;
-    public static final ForeignKey<WorkflowUserAccessRecord, WorkflowRecord> WORKFLOW_USER_ACCESS_IBFK_2 = ForeignKeys0.WORKFLOW_USER_ACCESS_IBFK_2;
-    public static final ForeignKey<WorkflowUserClonesRecord, UserRecord> WORKFLOW_USER_CLONES_IBFK_1 = ForeignKeys0.WORKFLOW_USER_CLONES_IBFK_1;
-    public static final ForeignKey<WorkflowUserClonesRecord, WorkflowRecord> WORKFLOW_USER_CLONES_IBFK_2 = ForeignKeys0.WORKFLOW_USER_CLONES_IBFK_2;
-    public static final ForeignKey<WorkflowUserLikesRecord, UserRecord> WORKFLOW_USER_LIKES_IBFK_1 = ForeignKeys0.WORKFLOW_USER_LIKES_IBFK_1;
-    public static final ForeignKey<WorkflowUserLikesRecord, WorkflowRecord> WORKFLOW_USER_LIKES_IBFK_2 = ForeignKeys0.WORKFLOW_USER_LIKES_IBFK_2;
-    public static final ForeignKey<WorkflowVersionRecord, WorkflowRecord> WORKFLOW_VERSION_IBFK_1 = ForeignKeys0.WORKFLOW_VERSION_IBFK_1;
-    public static final ForeignKey<WorkflowViewCountRecord, WorkflowRecord> WORKFLOW_VIEW_COUNT_IBFK_1 = ForeignKeys0.WORKFLOW_VIEW_COUNT_IBFK_1;
-
-    // -------------------------------------------------------------------------
-    // [#1459] distribute members to avoid static initialisers > 64kb
-    // -------------------------------------------------------------------------
-
-    private static class Identities0 {
-        public static Identity<DatasetRecord, UInteger> IDENTITY_DATASET = Internal.createIdentity(Dataset.DATASET, Dataset.DATASET.DID);
-        public static Identity<DatasetVersionRecord, UInteger> IDENTITY_DATASET_VERSION = Internal.createIdentity(DatasetVersion.DATASET_VERSION, DatasetVersion.DATASET_VERSION.DVID);
-        public static Identity<ProjectRecord, UInteger> IDENTITY_PROJECT = Internal.createIdentity(Project.PROJECT, Project.PROJECT.PID);
-        public static Identity<UserRecord, UInteger> IDENTITY_USER = Internal.createIdentity(User.USER, User.USER.UID);
-        public static Identity<WorkflowRecord, UInteger> IDENTITY_WORKFLOW = Internal.createIdentity(Workflow.WORKFLOW, Workflow.WORKFLOW.WID);
-        public static Identity<WorkflowExecutionsRecord, UInteger> IDENTITY_WORKFLOW_EXECUTIONS = Internal.createIdentity(WorkflowExecutions.WORKFLOW_EXECUTIONS, WorkflowExecutions.WORKFLOW_EXECUTIONS.EID);
-        public static Identity<WorkflowVersionRecord, UInteger> IDENTITY_WORKFLOW_VERSION = Internal.createIdentity(WorkflowVersion.WORKFLOW_VERSION, WorkflowVersion.WORKFLOW_VERSION.VID);
-    }
-
-    private static class UniqueKeys0 {
-        public static final UniqueKey<DatasetRecord> KEY_DATASET_PRIMARY = Internal.createUniqueKey(Dataset.DATASET, "KEY_dataset_PRIMARY", Dataset.DATASET.DID);
-        public static final UniqueKey<DatasetUserAccessRecord> KEY_DATASET_USER_ACCESS_PRIMARY = Internal.createUniqueKey(DatasetUserAccess.DATASET_USER_ACCESS, "KEY_dataset_user_access_PRIMARY", DatasetUserAccess.DATASET_USER_ACCESS.DID, DatasetUserAccess.DATASET_USER_ACCESS.UID);
-        public static final UniqueKey<DatasetVersionRecord> KEY_DATASET_VERSION_PRIMARY = Internal.createUniqueKey(DatasetVersion.DATASET_VERSION, "KEY_dataset_version_PRIMARY", DatasetVersion.DATASET_VERSION.DVID);
-        public static final UniqueKey<ProjectRecord> KEY_PROJECT_PRIMARY = Internal.createUniqueKey(Project.PROJECT, "KEY_project_PRIMARY", Project.PROJECT.PID);
-        public static final UniqueKey<ProjectRecord> KEY_PROJECT_OWNER_ID = Internal.createUniqueKey(Project.PROJECT, "KEY_project_owner_id", Project.PROJECT.OWNER_ID, Project.PROJECT.NAME);
-        public static final UniqueKey<ProjectUserAccessRecord> KEY_PROJECT_USER_ACCESS_PRIMARY = Internal.createUniqueKey(ProjectUserAccess.PROJECT_USER_ACCESS, "KEY_project_user_access_PRIMARY", ProjectUserAccess.PROJECT_USER_ACCESS.UID, ProjectUserAccess.PROJECT_USER_ACCESS.PID);
-        public static final UniqueKey<PublicProjectRecord> KEY_PUBLIC_PROJECT_PRIMARY = Internal.createUniqueKey(PublicProject.PUBLIC_PROJECT, "KEY_public_project_PRIMARY", PublicProject.PUBLIC_PROJECT.PID);
-        public static final UniqueKey<UserRecord> KEY_USER_PRIMARY = Internal.createUniqueKey(User.USER, "KEY_user_PRIMARY", User.USER.UID);
-        public static final UniqueKey<UserRecord> KEY_USER_EMAIL = Internal.createUniqueKey(User.USER, "KEY_user_email", User.USER.EMAIL);
-        public static final UniqueKey<UserRecord> KEY_USER_GOOGLE_ID = Internal.createUniqueKey(User.USER, "KEY_user_google_id", User.USER.GOOGLE_ID);
-        public static final UniqueKey<UserConfigRecord> KEY_USER_CONFIG_PRIMARY = Internal.createUniqueKey(UserConfig.USER_CONFIG, "KEY_user_config_PRIMARY", UserConfig.USER_CONFIG.UID, UserConfig.USER_CONFIG.KEY);
-        public static final UniqueKey<WorkflowRecord> KEY_WORKFLOW_PRIMARY = Internal.createUniqueKey(Workflow.WORKFLOW, "KEY_workflow_PRIMARY", Workflow.WORKFLOW.WID);
-        public static final UniqueKey<WorkflowExecutionsRecord> KEY_WORKFLOW_EXECUTIONS_PRIMARY = Internal.createUniqueKey(WorkflowExecutions.WORKFLOW_EXECUTIONS, "KEY_workflow_executions_PRIMARY", WorkflowExecutions.WORKFLOW_EXECUTIONS.EID);
-        public static final UniqueKey<WorkflowOfProjectRecord> KEY_WORKFLOW_OF_PROJECT_PRIMARY = Internal.createUniqueKey(WorkflowOfProject.WORKFLOW_OF_PROJECT, "KEY_workflow_of_project_PRIMARY", WorkflowOfProject.WORKFLOW_OF_PROJECT.WID, WorkflowOfProject.WORKFLOW_OF_PROJECT.PID);
-        public static final UniqueKey<WorkflowOfUserRecord> KEY_WORKFLOW_OF_USER_PRIMARY = Internal.createUniqueKey(WorkflowOfUser.WORKFLOW_OF_USER, "KEY_workflow_of_user_PRIMARY", WorkflowOfUser.WORKFLOW_OF_USER.UID, WorkflowOfUser.WORKFLOW_OF_USER.WID);
-        public static final UniqueKey<WorkflowRuntimeStatisticsRecord> KEY_WORKFLOW_RUNTIME_STATISTICS_PRIMARY = Internal.createUniqueKey(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS, "KEY_workflow_runtime_statistics_PRIMARY", WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.WORKFLOW_ID, WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.EXECUTION_ID, WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.OPERATOR_ID, WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.TIME);
-        public static final UniqueKey<WorkflowUserAccessRecord> KEY_WORKFLOW_USER_ACCESS_PRIMARY = Internal.createUniqueKey(WorkflowUserAccess.WORKFLOW_USER_ACCESS, "KEY_workflow_user_access_PRIMARY", WorkflowUserAccess.WORKFLOW_USER_ACCESS.UID, WorkflowUserAccess.WORKFLOW_USER_ACCESS.WID);
-        public static final UniqueKey<WorkflowUserClonesRecord> KEY_WORKFLOW_USER_CLONES_PRIMARY = Internal.createUniqueKey(WorkflowUserClones.WORKFLOW_USER_CLONES, "KEY_workflow_user_clones_PRIMARY", WorkflowUserClones.WORKFLOW_USER_CLONES.UID, WorkflowUserClones.WORKFLOW_USER_CLONES.WID);
-        public static final UniqueKey<WorkflowUserLikesRecord> KEY_WORKFLOW_USER_LIKES_PRIMARY = Internal.createUniqueKey(WorkflowUserLikes.WORKFLOW_USER_LIKES, "KEY_workflow_user_likes_PRIMARY", WorkflowUserLikes.WORKFLOW_USER_LIKES.UID, WorkflowUserLikes.WORKFLOW_USER_LIKES.WID);
-        public static final UniqueKey<WorkflowVersionRecord> KEY_WORKFLOW_VERSION_PRIMARY = Internal.createUniqueKey(WorkflowVersion.WORKFLOW_VERSION, "KEY_workflow_version_PRIMARY", WorkflowVersion.WORKFLOW_VERSION.VID);
-        public static final UniqueKey<WorkflowViewCountRecord> KEY_WORKFLOW_VIEW_COUNT_PRIMARY = Internal.createUniqueKey(WorkflowViewCount.WORKFLOW_VIEW_COUNT, "KEY_workflow_view_count_PRIMARY", WorkflowViewCount.WORKFLOW_VIEW_COUNT.WID);
-    }
-
-    private static class ForeignKeys0 {
-        public static final ForeignKey<DatasetRecord, UserRecord> DATASET_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, Dataset.DATASET, "dataset_ibfk_1", Dataset.DATASET.OWNER_UID);
-        public static final ForeignKey<DatasetUserAccessRecord, DatasetRecord> DATASET_USER_ACCESS_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_DATASET_PRIMARY, DatasetUserAccess.DATASET_USER_ACCESS, "dataset_user_access_ibfk_1", DatasetUserAccess.DATASET_USER_ACCESS.DID);
-        public static final ForeignKey<DatasetUserAccessRecord, UserRecord> DATASET_USER_ACCESS_IBFK_2 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, DatasetUserAccess.DATASET_USER_ACCESS, "dataset_user_access_ibfk_2", DatasetUserAccess.DATASET_USER_ACCESS.UID);
-        public static final ForeignKey<DatasetVersionRecord, DatasetRecord> DATASET_VERSION_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_DATASET_PRIMARY, DatasetVersion.DATASET_VERSION, "dataset_version_ibfk_1", DatasetVersion.DATASET_VERSION.DID);
-        public static final ForeignKey<ProjectRecord, UserRecord> PROJECT_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, Project.PROJECT, "project_ibfk_1", Project.PROJECT.OWNER_ID);
-        public static final ForeignKey<ProjectUserAccessRecord, UserRecord> PROJECT_USER_ACCESS_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, ProjectUserAccess.PROJECT_USER_ACCESS, "project_user_access_ibfk_1", ProjectUserAccess.PROJECT_USER_ACCESS.UID);
-        public static final ForeignKey<ProjectUserAccessRecord, ProjectRecord> PROJECT_USER_ACCESS_IBFK_2 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_PROJECT_PRIMARY, ProjectUserAccess.PROJECT_USER_ACCESS, "project_user_access_ibfk_2", ProjectUserAccess.PROJECT_USER_ACCESS.PID);
-        public static final ForeignKey<PublicProjectRecord, ProjectRecord> PUBLIC_PROJECT_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_PROJECT_PRIMARY, PublicProject.PUBLIC_PROJECT, "public_project_ibfk_1", PublicProject.PUBLIC_PROJECT.PID);
-        public static final ForeignKey<UserConfigRecord, UserRecord> USER_CONFIG_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, UserConfig.USER_CONFIG, "user_config_ibfk_1", UserConfig.USER_CONFIG.UID);
-        public static final ForeignKey<WorkflowExecutionsRecord, WorkflowVersionRecord> WORKFLOW_EXECUTIONS_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_VERSION_PRIMARY, WorkflowExecutions.WORKFLOW_EXECUTIONS, "workflow_executions_ibfk_1", WorkflowExecutions.WORKFLOW_EXECUTIONS.VID);
-        public static final ForeignKey<WorkflowExecutionsRecord, UserRecord> WORKFLOW_EXECUTIONS_IBFK_2 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, WorkflowExecutions.WORKFLOW_EXECUTIONS, "workflow_executions_ibfk_2", WorkflowExecutions.WORKFLOW_EXECUTIONS.UID);
-        public static final ForeignKey<WorkflowOfProjectRecord, WorkflowRecord> WORKFLOW_OF_PROJECT_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_PRIMARY, WorkflowOfProject.WORKFLOW_OF_PROJECT, "workflow_of_project_ibfk_1", WorkflowOfProject.WORKFLOW_OF_PROJECT.WID);
-        public static final ForeignKey<WorkflowOfProjectRecord, ProjectRecord> WORKFLOW_OF_PROJECT_IBFK_2 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_PROJECT_PRIMARY, WorkflowOfProject.WORKFLOW_OF_PROJECT, "workflow_of_project_ibfk_2", WorkflowOfProject.WORKFLOW_OF_PROJECT.PID);
-        public static final ForeignKey<WorkflowOfUserRecord, UserRecord> WORKFLOW_OF_USER_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, WorkflowOfUser.WORKFLOW_OF_USER, "workflow_of_user_ibfk_1", WorkflowOfUser.WORKFLOW_OF_USER.UID);
-        public static final ForeignKey<WorkflowOfUserRecord, WorkflowRecord> WORKFLOW_OF_USER_IBFK_2 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_PRIMARY, WorkflowOfUser.WORKFLOW_OF_USER, "workflow_of_user_ibfk_2", WorkflowOfUser.WORKFLOW_OF_USER.WID);
-        public static final ForeignKey<WorkflowRuntimeStatisticsRecord, WorkflowRecord> WORKFLOW_RUNTIME_STATISTICS_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_PRIMARY, WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS, "workflow_runtime_statistics_ibfk_1", WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.WORKFLOW_ID);
-        public static final ForeignKey<WorkflowRuntimeStatisticsRecord, WorkflowExecutionsRecord> WORKFLOW_RUNTIME_STATISTICS_IBFK_2 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_EXECUTIONS_PRIMARY, WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS, "workflow_runtime_statistics_ibfk_2", WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.EXECUTION_ID);
-        public static final ForeignKey<WorkflowUserAccessRecord, UserRecord> WORKFLOW_USER_ACCESS_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, WorkflowUserAccess.WORKFLOW_USER_ACCESS, "workflow_user_access_ibfk_1", WorkflowUserAccess.WORKFLOW_USER_ACCESS.UID);
-        public static final ForeignKey<WorkflowUserAccessRecord, WorkflowRecord> WORKFLOW_USER_ACCESS_IBFK_2 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_PRIMARY, WorkflowUserAccess.WORKFLOW_USER_ACCESS, "workflow_user_access_ibfk_2", WorkflowUserAccess.WORKFLOW_USER_ACCESS.WID);
-        public static final ForeignKey<WorkflowUserClonesRecord, UserRecord> WORKFLOW_USER_CLONES_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, WorkflowUserClones.WORKFLOW_USER_CLONES, "workflow_user_clones_ibfk_1", WorkflowUserClones.WORKFLOW_USER_CLONES.UID);
-        public static final ForeignKey<WorkflowUserClonesRecord, WorkflowRecord> WORKFLOW_USER_CLONES_IBFK_2 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_PRIMARY, WorkflowUserClones.WORKFLOW_USER_CLONES, "workflow_user_clones_ibfk_2", WorkflowUserClones.WORKFLOW_USER_CLONES.WID);
-        public static final ForeignKey<WorkflowUserLikesRecord, UserRecord> WORKFLOW_USER_LIKES_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_USER_PRIMARY, WorkflowUserLikes.WORKFLOW_USER_LIKES, "workflow_user_likes_ibfk_1", WorkflowUserLikes.WORKFLOW_USER_LIKES.UID);
-        public static final ForeignKey<WorkflowUserLikesRecord, WorkflowRecord> WORKFLOW_USER_LIKES_IBFK_2 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_PRIMARY, WorkflowUserLikes.WORKFLOW_USER_LIKES, "workflow_user_likes_ibfk_2", WorkflowUserLikes.WORKFLOW_USER_LIKES.WID);
-        public static final ForeignKey<WorkflowVersionRecord, WorkflowRecord> WORKFLOW_VERSION_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_PRIMARY, WorkflowVersion.WORKFLOW_VERSION, "workflow_version_ibfk_1", WorkflowVersion.WORKFLOW_VERSION.WID);
-        public static final ForeignKey<WorkflowViewCountRecord, WorkflowRecord> WORKFLOW_VIEW_COUNT_IBFK_1 = Internal.createForeignKey(edu.uci.ics.texera.web.model.jooq.generated.Keys.KEY_WORKFLOW_PRIMARY, WorkflowViewCount.WORKFLOW_VIEW_COUNT, "workflow_view_count_ibfk_1", WorkflowViewCount.WORKFLOW_VIEW_COUNT.WID);
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Tables.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Tables.java
deleted file mode 100644
index 2308ffcc240..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/Tables.java
+++ /dev/null
@@ -1,110 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.tables.*;
-
-
-/**
- * Convenience access to all tables in texera_db
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class Tables {
-
-    /**
-     * The table texera_db.dataset.
-     */
-    public static final Dataset DATASET = Dataset.DATASET;
-
-    /**
-     * The table texera_db.dataset_user_access.
-     */
-    public static final DatasetUserAccess DATASET_USER_ACCESS = DatasetUserAccess.DATASET_USER_ACCESS;
-
-    /**
-     * The table texera_db.dataset_version.
-     */
-    public static final DatasetVersion DATASET_VERSION = DatasetVersion.DATASET_VERSION;
-
-    /**
-     * The table texera_db.project.
-     */
-    public static final Project PROJECT = Project.PROJECT;
-
-    /**
-     * The table texera_db.project_user_access.
-     */
-    public static final ProjectUserAccess PROJECT_USER_ACCESS = ProjectUserAccess.PROJECT_USER_ACCESS;
-
-    /**
-     * The table texera_db.public_project.
-     */
-    public static final PublicProject PUBLIC_PROJECT = PublicProject.PUBLIC_PROJECT;
-
-    /**
-     * The table texera_db.user.
-     */
-    public static final User USER = User.USER;
-
-    /**
-     * The table texera_db.user_config.
-     */
-    public static final UserConfig USER_CONFIG = UserConfig.USER_CONFIG;
-
-    /**
-     * The table texera_db.workflow.
-     */
-    public static final Workflow WORKFLOW = Workflow.WORKFLOW;
-
-    /**
-     * The table texera_db.workflow_executions.
-     */
-    public static final WorkflowExecutions WORKFLOW_EXECUTIONS = WorkflowExecutions.WORKFLOW_EXECUTIONS;
-
-    /**
-     * The table texera_db.workflow_of_project.
-     */
-    public static final WorkflowOfProject WORKFLOW_OF_PROJECT = WorkflowOfProject.WORKFLOW_OF_PROJECT;
-
-    /**
-     * The table texera_db.workflow_of_user.
-     */
-    public static final WorkflowOfUser WORKFLOW_OF_USER = WorkflowOfUser.WORKFLOW_OF_USER;
-
-    /**
-     * The table texera_db.workflow_runtime_statistics.
-     */
-    public static final WorkflowRuntimeStatistics WORKFLOW_RUNTIME_STATISTICS = WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS;
-
-    /**
-     * The table texera_db.workflow_user_access.
-     */
-    public static final WorkflowUserAccess WORKFLOW_USER_ACCESS = WorkflowUserAccess.WORKFLOW_USER_ACCESS;
-
-    /**
-     * The table texera_db.workflow_user_activity.
-     */
-    public static final WorkflowUserActivity WORKFLOW_USER_ACTIVITY = WorkflowUserActivity.WORKFLOW_USER_ACTIVITY;
-
-    /**
-     * The table texera_db.workflow_user_clones.
-     */
-    public static final WorkflowUserClones WORKFLOW_USER_CLONES = WorkflowUserClones.WORKFLOW_USER_CLONES;
-
-    /**
-     * The table texera_db.workflow_user_likes.
-     */
-    public static final WorkflowUserLikes WORKFLOW_USER_LIKES = WorkflowUserLikes.WORKFLOW_USER_LIKES;
-
-    /**
-     * The table texera_db.workflow_version.
-     */
-    public static final WorkflowVersion WORKFLOW_VERSION = WorkflowVersion.WORKFLOW_VERSION;
-
-    /**
-     * The table texera_db.workflow_view_count.
-     */
-    public static final WorkflowViewCount WORKFLOW_VIEW_COUNT = WorkflowViewCount.WORKFLOW_VIEW_COUNT;
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/TexeraDb.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/TexeraDb.java
deleted file mode 100644
index e0b271b46ac..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/TexeraDb.java
+++ /dev/null
@@ -1,167 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.tables.*;
-import org.jooq.Catalog;
-import org.jooq.Table;
-import org.jooq.impl.SchemaImpl;
-
-import java.util.ArrayList;
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class TexeraDb extends SchemaImpl {
-
-    private static final long serialVersionUID = 2026314588;
-
-    /**
-     * The reference instance of texera_db
-     */
-    public static final TexeraDb TEXERA_DB = new TexeraDb();
-
-    /**
-     * The table texera_db.dataset.
-     */
-    public final Dataset DATASET = edu.uci.ics.texera.web.model.jooq.generated.tables.Dataset.DATASET;
-
-    /**
-     * The table texera_db.dataset_user_access.
-     */
-    public final DatasetUserAccess DATASET_USER_ACCESS = edu.uci.ics.texera.web.model.jooq.generated.tables.DatasetUserAccess.DATASET_USER_ACCESS;
-
-    /**
-     * The table texera_db.dataset_version.
-     */
-    public final DatasetVersion DATASET_VERSION = edu.uci.ics.texera.web.model.jooq.generated.tables.DatasetVersion.DATASET_VERSION;
-
-    /**
-     * The table texera_db.project.
-     */
-    public final Project PROJECT = edu.uci.ics.texera.web.model.jooq.generated.tables.Project.PROJECT;
-
-    /**
-     * The table texera_db.project_user_access.
-     */
-    public final ProjectUserAccess PROJECT_USER_ACCESS = edu.uci.ics.texera.web.model.jooq.generated.tables.ProjectUserAccess.PROJECT_USER_ACCESS;
-
-    /**
-     * The table texera_db.public_project.
-     */
-    public final PublicProject PUBLIC_PROJECT = edu.uci.ics.texera.web.model.jooq.generated.tables.PublicProject.PUBLIC_PROJECT;
-
-    /**
-     * The table texera_db.user.
-     */
-    public final User USER = edu.uci.ics.texera.web.model.jooq.generated.tables.User.USER;
-
-    /**
-     * The table texera_db.user_config.
-     */
-    public final UserConfig USER_CONFIG = edu.uci.ics.texera.web.model.jooq.generated.tables.UserConfig.USER_CONFIG;
-
-    /**
-     * The table texera_db.workflow.
-     */
-    public final Workflow WORKFLOW = edu.uci.ics.texera.web.model.jooq.generated.tables.Workflow.WORKFLOW;
-
-    /**
-     * The table texera_db.workflow_executions.
-     */
-    public final WorkflowExecutions WORKFLOW_EXECUTIONS = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowExecutions.WORKFLOW_EXECUTIONS;
-
-    /**
-     * The table texera_db.workflow_of_project.
-     */
-    public final WorkflowOfProject WORKFLOW_OF_PROJECT = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowOfProject.WORKFLOW_OF_PROJECT;
-
-    /**
-     * The table texera_db.workflow_of_user.
-     */
-    public final WorkflowOfUser WORKFLOW_OF_USER = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowOfUser.WORKFLOW_OF_USER;
-
-    /**
-     * The table texera_db.workflow_runtime_statistics.
-     */
-    public final WorkflowRuntimeStatistics WORKFLOW_RUNTIME_STATISTICS = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS;
-
-    /**
-     * The table texera_db.workflow_user_access.
-     */
-    public final WorkflowUserAccess WORKFLOW_USER_ACCESS = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserAccess.WORKFLOW_USER_ACCESS;
-
-    /**
-     * The table texera_db.workflow_user_activity.
-     */
-    public final WorkflowUserActivity WORKFLOW_USER_ACTIVITY = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserActivity.WORKFLOW_USER_ACTIVITY;
-
-    /**
-     * The table texera_db.workflow_user_clones.
-     */
-    public final WorkflowUserClones WORKFLOW_USER_CLONES = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserClones.WORKFLOW_USER_CLONES;
-
-    /**
-     * The table texera_db.workflow_user_likes.
-     */
-    public final WorkflowUserLikes WORKFLOW_USER_LIKES = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserLikes.WORKFLOW_USER_LIKES;
-
-    /**
-     * The table texera_db.workflow_version.
-     */
-    public final WorkflowVersion WORKFLOW_VERSION = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowVersion.WORKFLOW_VERSION;
-
-    /**
-     * The table texera_db.workflow_view_count.
-     */
-    public final WorkflowViewCount WORKFLOW_VIEW_COUNT = edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowViewCount.WORKFLOW_VIEW_COUNT;
-
-    /**
-     * No further instances allowed
-     */
-    private TexeraDb() {
-        super("texera_db", null);
-    }
-
-
-    @Override
-    public Catalog getCatalog() {
-        return DefaultCatalog.DEFAULT_CATALOG;
-    }
-
-    @Override
-    public final List<Table<?>> getTables() {
-        List result = new ArrayList();
-        result.addAll(getTables0());
-        return result;
-    }
-
-    private final List<Table<?>> getTables0() {
-        return Arrays.<Table<?>>asList(
-            Dataset.DATASET,
-            DatasetUserAccess.DATASET_USER_ACCESS,
-            DatasetVersion.DATASET_VERSION,
-            Project.PROJECT,
-            ProjectUserAccess.PROJECT_USER_ACCESS,
-            PublicProject.PUBLIC_PROJECT,
-            User.USER,
-            UserConfig.USER_CONFIG,
-            Workflow.WORKFLOW,
-            WorkflowExecutions.WORKFLOW_EXECUTIONS,
-            WorkflowOfProject.WORKFLOW_OF_PROJECT,
-            WorkflowOfUser.WORKFLOW_OF_USER,
-            WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS,
-            WorkflowUserAccess.WORKFLOW_USER_ACCESS,
-            WorkflowUserActivity.WORKFLOW_USER_ACTIVITY,
-            WorkflowUserClones.WORKFLOW_USER_CLONES,
-            WorkflowUserLikes.WORKFLOW_USER_LIKES,
-            WorkflowVersion.WORKFLOW_VERSION,
-            WorkflowViewCount.WORKFLOW_VIEW_COUNT);
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/DatasetUserAccessPrivilege.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/DatasetUserAccessPrivilege.java
deleted file mode 100644
index 54fc70fa4e2..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/DatasetUserAccessPrivilege.java
+++ /dev/null
@@ -1,49 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.enums;
-
-
-import org.jooq.Catalog;
-import org.jooq.EnumType;
-import org.jooq.Schema;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public enum DatasetUserAccessPrivilege implements EnumType {
-
-    NONE("NONE"),
-
-    READ("READ"),
-
-    WRITE("WRITE");
-
-    private final String literal;
-
-    private DatasetUserAccessPrivilege(String literal) {
-        this.literal = literal;
-    }
-
-    @Override
-    public Catalog getCatalog() {
-        return null;
-    }
-
-    @Override
-    public Schema getSchema() {
-        return null;
-    }
-
-    @Override
-    public String getName() {
-        return "dataset_user_access_privilege";
-    }
-
-    @Override
-    public String getLiteral() {
-        return literal;
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/ProjectUserAccessPrivilege.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/ProjectUserAccessPrivilege.java
deleted file mode 100644
index 72e671bb516..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/ProjectUserAccessPrivilege.java
+++ /dev/null
@@ -1,49 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.enums;
-
-
-import org.jooq.Catalog;
-import org.jooq.EnumType;
-import org.jooq.Schema;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public enum ProjectUserAccessPrivilege implements EnumType {
-
-    NONE("NONE"),
-
-    READ("READ"),
-
-    WRITE("WRITE");
-
-    private final String literal;
-
-    private ProjectUserAccessPrivilege(String literal) {
-        this.literal = literal;
-    }
-
-    @Override
-    public Catalog getCatalog() {
-        return null;
-    }
-
-    @Override
-    public Schema getSchema() {
-        return null;
-    }
-
-    @Override
-    public String getName() {
-        return "project_user_access_privilege";
-    }
-
-    @Override
-    public String getLiteral() {
-        return literal;
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/UserRole.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/UserRole.java
deleted file mode 100644
index a115af8e3d8..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/UserRole.java
+++ /dev/null
@@ -1,51 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.enums;
-
-
-import org.jooq.Catalog;
-import org.jooq.EnumType;
-import org.jooq.Schema;
-
-
-/**
- * This class is generated by jOOQ.
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public enum UserRole implements EnumType { - - INACTIVE("INACTIVE"), - - RESTRICTED("RESTRICTED"), - - REGULAR("REGULAR"), - - ADMIN("ADMIN"); - - private final String literal; - - private UserRole(String literal) { - this.literal = literal; - } - - @Override - public Catalog getCatalog() { - return null; - } - - @Override - public Schema getSchema() { - return null; - } - - @Override - public String getName() { - return "user_role"; - } - - @Override - public String getLiteral() { - return literal; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/WorkflowUserAccessPrivilege.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/WorkflowUserAccessPrivilege.java deleted file mode 100644 index dff22660276..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/enums/WorkflowUserAccessPrivilege.java +++ /dev/null @@ -1,49 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.enums; - - -import org.jooq.Catalog; -import org.jooq.EnumType; -import org.jooq.Schema; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public enum WorkflowUserAccessPrivilege implements EnumType { - - NONE("NONE"), - - READ("READ"), - - WRITE("WRITE"); - - private final String literal; - - private WorkflowUserAccessPrivilege(String literal) { - this.literal = literal; - } - - @Override - public Catalog getCatalog() { - return null; - } - - @Override - public Schema getSchema() { - return null; - } - - @Override - public String getName() { - return "workflow_user_access_privilege"; - } - - @Override - public String getLiteral() { - return literal; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Dataset.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Dataset.java deleted file mode 100644 index 4192e6ccb4f..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Dataset.java +++ /dev/null @@ -1,173 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.Indexes; -import edu.uci.ics.texera.web.model.jooq.generated.Keys; -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.DatasetRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class Dataset extends TableImpl<DatasetRecord> {
-
-    private static final long serialVersionUID = 1571657241;
-
-    /**
-     * The reference instance of <code>texera_db.dataset</code>
-     */
-    public static final Dataset DATASET = new Dataset();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<DatasetRecord> getRecordType() {
-        return DatasetRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.dataset.did</code>.
-     */
-    public final TableField<DatasetRecord, UInteger> DID = createField(DSL.name("did"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).identity(true), this, "");
-
-    /**
-     * The column <code>texera_db.dataset.owner_uid</code>.
-     */
-    public final TableField<DatasetRecord, UInteger> OWNER_UID = createField(DSL.name("owner_uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.dataset.name</code>.
-     */
-    public final TableField<DatasetRecord, String> NAME = createField(DSL.name("name"), org.jooq.impl.SQLDataType.VARCHAR(128).nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.dataset.is_public</code>.
-     */
-    public final TableField<DatasetRecord, Byte> IS_PUBLIC = createField(DSL.name("is_public"), org.jooq.impl.SQLDataType.TINYINT.nullable(false).defaultValue(org.jooq.impl.DSL.inline("1", org.jooq.impl.SQLDataType.TINYINT)), this, "");
-
-    /**
-     * The column <code>texera_db.dataset.description</code>.
-     */
-    public final TableField<DatasetRecord, String> DESCRIPTION = createField(DSL.name("description"), org.jooq.impl.SQLDataType.VARCHAR(512).nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.dataset.creation_time</code>.
-     */
-    public final TableField<DatasetRecord, Timestamp> CREATION_TIME = createField(DSL.name("creation_time"), org.jooq.impl.SQLDataType.TIMESTAMP.nullable(false).defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP", org.jooq.impl.SQLDataType.TIMESTAMP)), this, "");
-
-    /**
-     * Create a <code>texera_db.dataset</code> table reference
-     */
-    public Dataset() {
-        this(DSL.name("dataset"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.dataset</code> table reference
-     */
-    public Dataset(String alias) {
-        this(DSL.name(alias), DATASET);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.dataset</code> table reference
-     */
-    public Dataset(Name alias) {
-        this(alias, DATASET);
-    }
-
-    private Dataset(Name alias, Table<DatasetRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private Dataset(Name alias, Table<DatasetRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> Dataset(Table<O> child, ForeignKey<O, DatasetRecord> key) {
-        super(child, key, DATASET);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.DATASET_IDX_DATASET_NAME_DESCRIPTION, Indexes.DATASET_OWNER_UID, Indexes.DATASET_PRIMARY);
-    }
-
-    @Override
-    public Identity<DatasetRecord, UInteger> getIdentity() {
-        return Keys.IDENTITY_DATASET;
-    }
-
-    @Override
-    public UniqueKey<DatasetRecord> getPrimaryKey() {
-        return Keys.KEY_DATASET_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<DatasetRecord>> getKeys() {
-        return Arrays.<UniqueKey<DatasetRecord>>asList(Keys.KEY_DATASET_PRIMARY);
-    }
-
-    @Override
-    public List<ForeignKey<DatasetRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<DatasetRecord, ?>>asList(Keys.DATASET_IBFK_1);
-    }
-
-    public User user() {
-        return new User(this, Keys.DATASET_IBFK_1);
-    }
-
-    @Override
-    public Dataset as(String alias) {
-        return new Dataset(DSL.name(alias), this);
-    }
-
-    @Override
-    public Dataset as(Name alias) {
-        return new Dataset(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public Dataset rename(String name) {
-        return new Dataset(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public Dataset rename(Name name) {
-        return new Dataset(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row6 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row6<UInteger, UInteger, String, Byte, String, Timestamp> fieldsRow() {
-        return (Row6) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/DatasetUserAccess.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/DatasetUserAccess.java
deleted file mode 100644
index 63f48822646..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/DatasetUserAccess.java
+++ /dev/null
@@ -1,157 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.Indexes;
-import edu.uci.ics.texera.web.model.jooq.generated.Keys;
-import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb;
-import edu.uci.ics.texera.web.model.jooq.generated.enums.DatasetUserAccessPrivilege;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.DatasetUserAccessRecord;
-import org.jooq.*;
-import org.jooq.impl.DSL;
-import org.jooq.impl.TableImpl;
-import org.jooq.types.UInteger;
-
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class DatasetUserAccess extends TableImpl<DatasetUserAccessRecord> {
-
-    private static final long serialVersionUID = -996212423;
-
-    /**
-     * The reference instance of <code>texera_db.dataset_user_access</code>
-     */
-    public static final DatasetUserAccess DATASET_USER_ACCESS = new DatasetUserAccess();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<DatasetUserAccessRecord> getRecordType() {
-        return DatasetUserAccessRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.dataset_user_access.did</code>.
- */
-    public final TableField<DatasetUserAccessRecord, UInteger> DID = createField(DSL.name("did"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.dataset_user_access.uid</code>.
-     */
-    public final TableField<DatasetUserAccessRecord, UInteger> UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.dataset_user_access.privilege</code>.
-     */
-    public final TableField<DatasetUserAccessRecord, DatasetUserAccessPrivilege> PRIVILEGE = createField(DSL.name("privilege"), org.jooq.impl.SQLDataType.VARCHAR(5).nullable(false).defaultValue(org.jooq.impl.DSL.inline("NONE", org.jooq.impl.SQLDataType.VARCHAR)).asEnumDataType(edu.uci.ics.texera.web.model.jooq.generated.enums.DatasetUserAccessPrivilege.class), this, "");
-
-    /**
-     * Create a <code>texera_db.dataset_user_access</code> table reference
-     */
-    public DatasetUserAccess() {
-        this(DSL.name("dataset_user_access"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.dataset_user_access</code> table reference
-     */
-    public DatasetUserAccess(String alias) {
-        this(DSL.name(alias), DATASET_USER_ACCESS);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.dataset_user_access</code> table reference
-     */
-    public DatasetUserAccess(Name alias) {
-        this(alias, DATASET_USER_ACCESS);
-    }
-
-    private DatasetUserAccess(Name alias, Table<DatasetUserAccessRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private DatasetUserAccess(Name alias, Table<DatasetUserAccessRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> DatasetUserAccess(Table<O> child, ForeignKey<O, DatasetUserAccessRecord> key) {
-        super(child, key, DATASET_USER_ACCESS);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.DATASET_USER_ACCESS_PRIMARY, Indexes.DATASET_USER_ACCESS_UID);
-    }
-
-    @Override
-    public UniqueKey<DatasetUserAccessRecord> getPrimaryKey() {
-        return Keys.KEY_DATASET_USER_ACCESS_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<DatasetUserAccessRecord>> getKeys() {
-        return Arrays.<UniqueKey<DatasetUserAccessRecord>>asList(Keys.KEY_DATASET_USER_ACCESS_PRIMARY);
-    }
-
-    @Override
-    public List<ForeignKey<DatasetUserAccessRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<DatasetUserAccessRecord, ?>>asList(Keys.DATASET_USER_ACCESS_IBFK_1, Keys.DATASET_USER_ACCESS_IBFK_2);
-    }
-
-    public Dataset dataset() {
-        return new Dataset(this, Keys.DATASET_USER_ACCESS_IBFK_1);
-    }
-
-    public User user() {
-        return new User(this, Keys.DATASET_USER_ACCESS_IBFK_2);
-    }
-
-    @Override
-    public DatasetUserAccess as(String alias) {
-        return new DatasetUserAccess(DSL.name(alias), this);
-    }
-
-    @Override
-    public DatasetUserAccess as(Name alias) {
-        return new DatasetUserAccess(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public DatasetUserAccess rename(String name) {
-        return new DatasetUserAccess(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public DatasetUserAccess rename(Name name) {
-        return new DatasetUserAccess(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row3 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row3<UInteger, UInteger, DatasetUserAccessPrivilege> fieldsRow() {
-        return (Row3) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/DatasetVersion.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/DatasetVersion.java
deleted file mode 100644
index 4615b229f1d..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/DatasetVersion.java
+++ /dev/null
@@ -1,173 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.Indexes;
-import edu.uci.ics.texera.web.model.jooq.generated.Keys;
-import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.DatasetVersionRecord;
-import org.jooq.*;
-import org.jooq.impl.DSL;
-import org.jooq.impl.TableImpl;
-import org.jooq.types.UInteger;
-
-import java.sql.Timestamp;
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class DatasetVersion extends TableImpl<DatasetVersionRecord> {
-
-    private static final long serialVersionUID = 25893167;
-
-    /**
-     * The reference instance of <code>texera_db.dataset_version</code>
-     */
-    public static final DatasetVersion DATASET_VERSION = new DatasetVersion();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<DatasetVersionRecord> getRecordType() {
-        return DatasetVersionRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.dataset_version.dvid</code>.
-     */
-    public final TableField<DatasetVersionRecord, UInteger> DVID = createField(DSL.name("dvid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).identity(true), this, "");
-
-    /**
-     * The column <code>texera_db.dataset_version.did</code>.
-     */
-    public final TableField<DatasetVersionRecord, UInteger> DID = createField(DSL.name("did"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.dataset_version.creator_uid</code>.
-     */
-    public final TableField<DatasetVersionRecord, UInteger> CREATOR_UID = createField(DSL.name("creator_uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.dataset_version.name</code>.
-     */
-    public final TableField<DatasetVersionRecord, String> NAME = createField(DSL.name("name"), org.jooq.impl.SQLDataType.VARCHAR(128).nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.dataset_version.version_hash</code>.
-     */
-    public final TableField<DatasetVersionRecord, String> VERSION_HASH = createField(DSL.name("version_hash"), org.jooq.impl.SQLDataType.VARCHAR(64).nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.dataset_version.creation_time</code>.
-     */
-    public final TableField<DatasetVersionRecord, Timestamp> CREATION_TIME = createField(DSL.name("creation_time"), org.jooq.impl.SQLDataType.TIMESTAMP.nullable(false).defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP", org.jooq.impl.SQLDataType.TIMESTAMP)), this, "");
-
-    /**
-     * Create a <code>texera_db.dataset_version</code> table reference
-     */
-    public DatasetVersion() {
-        this(DSL.name("dataset_version"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.dataset_version</code> table reference
-     */
-    public DatasetVersion(String alias) {
-        this(DSL.name(alias), DATASET_VERSION);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.dataset_version</code> table reference
-     */
-    public DatasetVersion(Name alias) {
-        this(alias, DATASET_VERSION);
-    }
-
-    private DatasetVersion(Name alias, Table<DatasetVersionRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private DatasetVersion(Name alias, Table<DatasetVersionRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> DatasetVersion(Table<O> child, ForeignKey<O, DatasetVersionRecord> key) {
-        super(child, key, DATASET_VERSION);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.DATASET_VERSION_DID, Indexes.DATASET_VERSION_IDX_DATASET_VERSION_NAME, Indexes.DATASET_VERSION_PRIMARY);
-    }
-
-    @Override
-    public Identity<DatasetVersionRecord, UInteger> getIdentity() {
-        return Keys.IDENTITY_DATASET_VERSION;
-    }
-
-    @Override
-    public UniqueKey<DatasetVersionRecord> getPrimaryKey() {
-        return Keys.KEY_DATASET_VERSION_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<DatasetVersionRecord>> getKeys() {
-        return Arrays.<UniqueKey<DatasetVersionRecord>>asList(Keys.KEY_DATASET_VERSION_PRIMARY);
-    }
-
-    @Override
-    public List<ForeignKey<DatasetVersionRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<DatasetVersionRecord, ?>>asList(Keys.DATASET_VERSION_IBFK_1);
-    }
-
-    public Dataset dataset() {
-        return new Dataset(this, Keys.DATASET_VERSION_IBFK_1);
-    }
-
-    @Override
-    public DatasetVersion as(String alias) {
-        return new DatasetVersion(DSL.name(alias), this);
-    }
-
-    @Override
-    public DatasetVersion as(Name alias) {
-        return new DatasetVersion(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public DatasetVersion rename(String name) {
-        return new DatasetVersion(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public DatasetVersion rename(Name name) {
-        return new DatasetVersion(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row6 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row6<UInteger, UInteger, UInteger, String, String, Timestamp> fieldsRow() {
-        return (Row6) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Project.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Project.java
deleted file mode 100644
index a46efd3f34e..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Project.java
+++ /dev/null
@@ -1,173 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.Indexes;
-import edu.uci.ics.texera.web.model.jooq.generated.Keys;
-import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.ProjectRecord;
-import org.jooq.*;
-import org.jooq.impl.DSL;
-import org.jooq.impl.TableImpl;
-import org.jooq.types.UInteger;
-
-import java.sql.Timestamp;
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class Project extends TableImpl<ProjectRecord> {
-
-    private static final long serialVersionUID = 1829720653;
-
-    /**
-     * The reference instance of <code>texera_db.project</code>
-     */
-    public static final Project PROJECT = new Project();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<ProjectRecord> getRecordType() {
-        return ProjectRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.project.pid</code>.
-     */
-    public final TableField<ProjectRecord, UInteger> PID = createField(DSL.name("pid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).identity(true), this, "");
-
-    /**
-     * The column <code>texera_db.project.name</code>.
-     */
-    public final TableField<ProjectRecord, String> NAME = createField(DSL.name("name"), org.jooq.impl.SQLDataType.VARCHAR(128).nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.project.description</code>.
-     */
-    public final TableField<ProjectRecord, String> DESCRIPTION = createField(DSL.name("description"), org.jooq.impl.SQLDataType.VARCHAR(10000), this, "");
-
-    /**
-     * The column <code>texera_db.project.owner_id</code>.
-     */
-    public final TableField<ProjectRecord, UInteger> OWNER_ID = createField(DSL.name("owner_id"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.project.creation_time</code>.
-     */
-    public final TableField<ProjectRecord, Timestamp> CREATION_TIME = createField(DSL.name("creation_time"), org.jooq.impl.SQLDataType.TIMESTAMP.nullable(false).defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP", org.jooq.impl.SQLDataType.TIMESTAMP)), this, "");
-
-    /**
-     * The column <code>texera_db.project.color</code>.
-     */
-    public final TableField<ProjectRecord, String> COLOR = createField(DSL.name("color"), org.jooq.impl.SQLDataType.VARCHAR(6), this, "");
-
-    /**
-     * Create a <code>texera_db.project</code> table reference
-     */
-    public Project() {
-        this(DSL.name("project"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.project</code> table reference
-     */
-    public Project(String alias) {
-        this(DSL.name(alias), PROJECT);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.project</code> table reference
-     */
-    public Project(Name alias) {
-        this(alias, PROJECT);
-    }
-
-    private Project(Name alias, Table<ProjectRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private Project(Name alias, Table<ProjectRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> Project(Table<O> child, ForeignKey<O, ProjectRecord> key) {
-        super(child, key, PROJECT);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.PROJECT_IDX_USER_PROJECT_NAME_DESCRIPTION, Indexes.PROJECT_OWNER_ID, Indexes.PROJECT_PRIMARY);
-    }
-
-    @Override
-    public Identity<ProjectRecord, UInteger> getIdentity() {
-        return Keys.IDENTITY_PROJECT;
-    }
-
-    @Override
-    public UniqueKey<ProjectRecord> getPrimaryKey() {
-        return Keys.KEY_PROJECT_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<ProjectRecord>> getKeys() {
-        return Arrays.<UniqueKey<ProjectRecord>>asList(Keys.KEY_PROJECT_PRIMARY, Keys.KEY_PROJECT_OWNER_ID);
-    }
-
-    @Override
-    public List<ForeignKey<ProjectRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<ProjectRecord, ?>>asList(Keys.PROJECT_IBFK_1);
-    }
-
-    public User user() {
-        return new User(this, Keys.PROJECT_IBFK_1);
-    }
-
-    @Override
-    public Project as(String alias) {
-        return new Project(DSL.name(alias), this);
-    }
-
-    @Override
-    public Project as(Name alias) {
-        return new Project(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public Project rename(String name) {
-        return new Project(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public Project rename(Name name) {
-        return new Project(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row6 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row6<UInteger, String, String, UInteger, Timestamp, String> fieldsRow() {
-        return (Row6) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/ProjectUserAccess.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/ProjectUserAccess.java
deleted file mode 100644
index 1d8883ce1f6..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/ProjectUserAccess.java
+++ /dev/null
@@ -1,157 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.Indexes;
-import edu.uci.ics.texera.web.model.jooq.generated.Keys;
-import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb;
-import edu.uci.ics.texera.web.model.jooq.generated.enums.ProjectUserAccessPrivilege;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.ProjectUserAccessRecord;
-import org.jooq.*;
-import org.jooq.impl.DSL;
-import org.jooq.impl.TableImpl;
-import org.jooq.types.UInteger;
-
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class ProjectUserAccess extends TableImpl<ProjectUserAccessRecord> {
-
-    private static final long serialVersionUID = -2015215347;
-
-    /**
-     * The reference instance of <code>texera_db.project_user_access</code>
-     */
-    public static final ProjectUserAccess PROJECT_USER_ACCESS = new ProjectUserAccess();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<ProjectUserAccessRecord> getRecordType() {
-        return ProjectUserAccessRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.project_user_access.uid</code>.
- */
-    public final TableField<ProjectUserAccessRecord, UInteger> UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.project_user_access.pid</code>.
-     */
-    public final TableField<ProjectUserAccessRecord, UInteger> PID = createField(DSL.name("pid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.project_user_access.privilege</code>.
-     */
-    public final TableField<ProjectUserAccessRecord, ProjectUserAccessPrivilege> PRIVILEGE = createField(DSL.name("privilege"), org.jooq.impl.SQLDataType.VARCHAR(5).nullable(false).defaultValue(org.jooq.impl.DSL.inline("NONE", org.jooq.impl.SQLDataType.VARCHAR)).asEnumDataType(edu.uci.ics.texera.web.model.jooq.generated.enums.ProjectUserAccessPrivilege.class), this, "");
-
-    /**
-     * Create a <code>texera_db.project_user_access</code> table reference
-     */
-    public ProjectUserAccess() {
-        this(DSL.name("project_user_access"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.project_user_access</code> table reference
-     */
-    public ProjectUserAccess(String alias) {
-        this(DSL.name(alias), PROJECT_USER_ACCESS);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.project_user_access</code> table reference
-     */
-    public ProjectUserAccess(Name alias) {
-        this(alias, PROJECT_USER_ACCESS);
-    }
-
-    private ProjectUserAccess(Name alias, Table<ProjectUserAccessRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private ProjectUserAccess(Name alias, Table<ProjectUserAccessRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> ProjectUserAccess(Table<O> child, ForeignKey<O, ProjectUserAccessRecord> key) {
-        super(child, key, PROJECT_USER_ACCESS);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.PROJECT_USER_ACCESS_PID, Indexes.PROJECT_USER_ACCESS_PRIMARY);
-    }
-
-    @Override
-    public UniqueKey<ProjectUserAccessRecord> getPrimaryKey() {
-        return Keys.KEY_PROJECT_USER_ACCESS_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<ProjectUserAccessRecord>> getKeys() {
-        return Arrays.<UniqueKey<ProjectUserAccessRecord>>asList(Keys.KEY_PROJECT_USER_ACCESS_PRIMARY);
-    }
-
-    @Override
-    public List<ForeignKey<ProjectUserAccessRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<ProjectUserAccessRecord, ?>>asList(Keys.PROJECT_USER_ACCESS_IBFK_1, Keys.PROJECT_USER_ACCESS_IBFK_2);
-    }
-
-    public User user() {
-        return new User(this, Keys.PROJECT_USER_ACCESS_IBFK_1);
-    }
-
-    public Project project() {
-        return new Project(this, Keys.PROJECT_USER_ACCESS_IBFK_2);
-    }
-
-    @Override
-    public ProjectUserAccess as(String alias) {
-        return new ProjectUserAccess(DSL.name(alias), this);
-    }
-
-    @Override
-    public ProjectUserAccess as(Name alias) {
-        return new ProjectUserAccess(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public ProjectUserAccess rename(String name) {
-        return new ProjectUserAccess(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public ProjectUserAccess rename(Name name) {
-        return new ProjectUserAccess(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row3 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row3<UInteger, UInteger, ProjectUserAccessPrivilege> fieldsRow() {
-        return (Row3) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/PublicProject.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/PublicProject.java
deleted file mode 100644
index d7e0f77eed8..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/PublicProject.java
+++ /dev/null
@@ -1,147 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.Indexes;
-import edu.uci.ics.texera.web.model.jooq.generated.Keys;
-import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.PublicProjectRecord;
-import org.jooq.*;
-import org.jooq.impl.DSL;
-import org.jooq.impl.TableImpl;
-import org.jooq.types.UInteger;
-
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class PublicProject extends TableImpl<PublicProjectRecord> {
-
-    private static final long serialVersionUID = 509034382;
-
-    /**
-     * The reference instance of <code>texera_db.public_project</code>
-     */
-    public static final PublicProject PUBLIC_PROJECT = new PublicProject();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<PublicProjectRecord> getRecordType() {
-        return PublicProjectRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.public_project.pid</code>.
-     */
-    public final TableField<PublicProjectRecord, UInteger> PID = createField(DSL.name("pid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.public_project.uid</code>.
-     */
-    public final TableField<PublicProjectRecord, UInteger> UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED, this, "");
-
-    /**
-     * Create a <code>texera_db.public_project</code> table reference
-     */
-    public PublicProject() {
-        this(DSL.name("public_project"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.public_project</code> table reference
-     */
-    public PublicProject(String alias) {
-        this(DSL.name(alias), PUBLIC_PROJECT);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.public_project</code> table reference
-     */
-    public PublicProject(Name alias) {
-        this(alias, PUBLIC_PROJECT);
-    }
-
-    private PublicProject(Name alias, Table<PublicProjectRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private PublicProject(Name alias, Table<PublicProjectRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> PublicProject(Table<O> child, ForeignKey<O, PublicProjectRecord> key) {
-        super(child, key, PUBLIC_PROJECT);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.PUBLIC_PROJECT_PRIMARY);
-    }
-
-    @Override
-    public UniqueKey<PublicProjectRecord> getPrimaryKey() {
-        return Keys.KEY_PUBLIC_PROJECT_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<PublicProjectRecord>> getKeys() {
-        return Arrays.<UniqueKey<PublicProjectRecord>>asList(Keys.KEY_PUBLIC_PROJECT_PRIMARY);
-    }
-
-    @Override
-    public List<ForeignKey<PublicProjectRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<PublicProjectRecord, ?>>asList(Keys.PUBLIC_PROJECT_IBFK_1);
-    }
-
-    public Project project() {
-        return new Project(this, Keys.PUBLIC_PROJECT_IBFK_1);
-    }
-
-    @Override
-    public PublicProject as(String alias) {
-        return new PublicProject(DSL.name(alias), this);
-    }
-
-    @Override
-    public PublicProject as(Name alias) {
-        return new PublicProject(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public PublicProject rename(String name) {
-        return new PublicProject(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public PublicProject rename(Name name) {
-        return new PublicProject(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row2 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row2<UInteger, UInteger> fieldsRow() {
-        return (Row2) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/User.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/User.java
deleted file mode 100644
index 3b2e309a9e4..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/User.java
+++ /dev/null
@@ -1,169 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.Indexes;
-import edu.uci.ics.texera.web.model.jooq.generated.Keys;
-import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb;
-import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.UserRecord;
-import org.jooq.*;
-import org.jooq.impl.DSL;
-import org.jooq.impl.TableImpl;
-import org.jooq.types.UInteger;
-
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class User extends TableImpl<UserRecord> {
-
-    private static final long serialVersionUID = 670965342;
-
-    /**
-     * The reference instance of <code>texera_db.user</code>
-     */
-    public static final User USER = new User();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<UserRecord> getRecordType() {
-        return UserRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.user.uid</code>.
-     */
-    public final TableField<UserRecord, UInteger> UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).identity(true), this, "");
-
-    /**
-     * The column <code>texera_db.user.name</code>.
- */ - public final TableField NAME = createField(DSL.name("name"), org.jooq.impl.SQLDataType.VARCHAR(256).nullable(false), this, ""); - - /** - * The column texera_db.user.email. - */ - public final TableField EMAIL = createField(DSL.name("email"), org.jooq.impl.SQLDataType.VARCHAR(256), this, ""); - - /** - * The column texera_db.user.password. - */ - public final TableField PASSWORD = createField(DSL.name("password"), org.jooq.impl.SQLDataType.VARCHAR(256), this, ""); - - /** - * The column texera_db.user.google_id. - */ - public final TableField GOOGLE_ID = createField(DSL.name("google_id"), org.jooq.impl.SQLDataType.VARCHAR(256), this, ""); - - /** - * The column texera_db.user.role. - */ - public final TableField ROLE = createField(DSL.name("role"), org.jooq.impl.SQLDataType.VARCHAR(10).nullable(false).defaultValue(org.jooq.impl.DSL.inline("INACTIVE", org.jooq.impl.SQLDataType.VARCHAR)).asEnumDataType(edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole.class), this, ""); - - /** - * The column texera_db.user.google_avatar. 
- */ - public final TableField GOOGLE_AVATAR = createField(DSL.name("google_avatar"), org.jooq.impl.SQLDataType.VARCHAR(100), this, ""); - - /** - * Create a texera_db.user table reference - */ - public User() { - this(DSL.name("user"), null); - } - - /** - * Create an aliased texera_db.user table reference - */ - public User(String alias) { - this(DSL.name(alias), USER); - } - - /** - * Create an aliased texera_db.user table reference - */ - public User(Name alias) { - this(alias, USER); - } - - private User(Name alias, Table aliased) { - this(alias, aliased, null); - } - - private User(Name alias, Table aliased, Field[] parameters) { - super(alias, null, aliased, parameters, DSL.comment("")); - } - - public User(Table child, ForeignKey key) { - super(child, key, USER); - } - - @Override - public Schema getSchema() { - return TexeraDb.TEXERA_DB; - } - - @Override - public List getIndexes() { - return Arrays.asList(Indexes.USER_EMAIL, Indexes.USER_GOOGLE_ID, Indexes.USER_IDX_USER_NAME, Indexes.USER_PRIMARY); - } - - @Override - public Identity getIdentity() { - return Keys.IDENTITY_USER; - } - - @Override - public UniqueKey getPrimaryKey() { - return Keys.KEY_USER_PRIMARY; - } - - @Override - public List> getKeys() { - return Arrays.>asList(Keys.KEY_USER_PRIMARY, Keys.KEY_USER_EMAIL, Keys.KEY_USER_GOOGLE_ID); - } - - @Override - public User as(String alias) { - return new User(DSL.name(alias), this); - } - - @Override - public User as(Name alias) { - return new User(alias, this); - } - - /** - * Rename this table - */ - @Override - public User rename(String name) { - return new User(DSL.name(name), null); - } - - /** - * Rename this table - */ - @Override - public User rename(Name name) { - return new User(name, null); - } - - // ------------------------------------------------------------------------- - // Row7 type methods - // ------------------------------------------------------------------------- - - @Override - public Row7 fieldsRow() { - return (Row7) 
super.fieldsRow(); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/UserConfig.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/UserConfig.java deleted file mode 100644 index e671f7670b6..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/UserConfig.java +++ /dev/null @@ -1,152 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.Indexes; -import edu.uci.ics.texera.web.model.jooq.generated.Keys; -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.UserConfigRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; - -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class UserConfig extends TableImpl { - - private static final long serialVersionUID = -666438922; - - /** - * The reference instance of texera_db.user_config - */ - public static final UserConfig USER_CONFIG = new UserConfig(); - - /** - * The class holding records for this type - */ - @Override - public Class getRecordType() { - return UserConfigRecord.class; - } - - /** - * The column texera_db.user_config.uid. - */ - public final TableField UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * The column texera_db.user_config.key. - */ - public final TableField KEY = createField(DSL.name("key"), org.jooq.impl.SQLDataType.VARCHAR(256).nullable(false), this, ""); - - /** - * The column texera_db.user_config.value. 
- */ - public final TableField VALUE = createField(DSL.name("value"), org.jooq.impl.SQLDataType.CLOB.nullable(false), this, ""); - - /** - * Create a texera_db.user_config table reference - */ - public UserConfig() { - this(DSL.name("user_config"), null); - } - - /** - * Create an aliased texera_db.user_config table reference - */ - public UserConfig(String alias) { - this(DSL.name(alias), USER_CONFIG); - } - - /** - * Create an aliased texera_db.user_config table reference - */ - public UserConfig(Name alias) { - this(alias, USER_CONFIG); - } - - private UserConfig(Name alias, Table aliased) { - this(alias, aliased, null); - } - - private UserConfig(Name alias, Table aliased, Field[] parameters) { - super(alias, null, aliased, parameters, DSL.comment("")); - } - - public UserConfig(Table child, ForeignKey key) { - super(child, key, USER_CONFIG); - } - - @Override - public Schema getSchema() { - return TexeraDb.TEXERA_DB; - } - - @Override - public List getIndexes() { - return Arrays.asList(Indexes.USER_CONFIG_PRIMARY); - } - - @Override - public UniqueKey getPrimaryKey() { - return Keys.KEY_USER_CONFIG_PRIMARY; - } - - @Override - public List> getKeys() { - return Arrays.>asList(Keys.KEY_USER_CONFIG_PRIMARY); - } - - @Override - public List> getReferences() { - return Arrays.>asList(Keys.USER_CONFIG_IBFK_1); - } - - public User user() { - return new User(this, Keys.USER_CONFIG_IBFK_1); - } - - @Override - public UserConfig as(String alias) { - return new UserConfig(DSL.name(alias), this); - } - - @Override - public UserConfig as(Name alias) { - return new UserConfig(alias, this); - } - - /** - * Rename this table - */ - @Override - public UserConfig rename(String name) { - return new UserConfig(DSL.name(name), null); - } - - /** - * Rename this table - */ - @Override - public UserConfig rename(Name name) { - return new UserConfig(name, null); - } - - // ------------------------------------------------------------------------- - // Row3 type methods - // 
------------------------------------------------------------------------- - - @Override - public Row3 fieldsRow() { - return (Row3) super.fieldsRow(); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Workflow.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Workflow.java deleted file mode 100644 index 57fb8c39bd3..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/Workflow.java +++ /dev/null @@ -1,169 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.Indexes; -import edu.uci.ics.texera.web.model.jooq.generated.Keys; -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class Workflow extends TableImpl { - - private static final long serialVersionUID = -1770381178; - - /** - * The reference instance of texera_db.workflow - */ - public static final Workflow WORKFLOW = new Workflow(); - - /** - * The class holding records for this type - */ - @Override - public Class getRecordType() { - return WorkflowRecord.class; - } - - /** - * The column texera_db.workflow.name. - */ - public final TableField NAME = createField(DSL.name("name"), org.jooq.impl.SQLDataType.VARCHAR(128).nullable(false), this, ""); - - /** - * The column texera_db.workflow.description. - */ - public final TableField DESCRIPTION = createField(DSL.name("description"), org.jooq.impl.SQLDataType.VARCHAR(500), this, ""); - - /** - * The column texera_db.workflow.wid. 
- */ - public final TableField WID = createField(DSL.name("wid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).identity(true), this, ""); - - /** - * The column texera_db.workflow.content. - */ - public final TableField CONTENT = createField(DSL.name("content"), org.jooq.impl.SQLDataType.CLOB.nullable(false), this, ""); - - /** - * The column texera_db.workflow.creation_time. - */ - public final TableField CREATION_TIME = createField(DSL.name("creation_time"), org.jooq.impl.SQLDataType.TIMESTAMP.nullable(false).defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP", org.jooq.impl.SQLDataType.TIMESTAMP)), this, ""); - - /** - * The column texera_db.workflow.last_modified_time. - */ - public final TableField LAST_MODIFIED_TIME = createField(DSL.name("last_modified_time"), org.jooq.impl.SQLDataType.TIMESTAMP.nullable(false).defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP", org.jooq.impl.SQLDataType.TIMESTAMP)), this, ""); - - /** - * The column texera_db.workflow.is_published. 
- */ - public final TableField IS_PUBLISHED = createField(DSL.name("is_published"), org.jooq.impl.SQLDataType.TINYINT.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.TINYINT)), this, ""); - - /** - * Create a texera_db.workflow table reference - */ - public Workflow() { - this(DSL.name("workflow"), null); - } - - /** - * Create an aliased texera_db.workflow table reference - */ - public Workflow(String alias) { - this(DSL.name(alias), WORKFLOW); - } - - /** - * Create an aliased texera_db.workflow table reference - */ - public Workflow(Name alias) { - this(alias, WORKFLOW); - } - - private Workflow(Name alias, Table aliased) { - this(alias, aliased, null); - } - - private Workflow(Name alias, Table aliased, Field[] parameters) { - super(alias, null, aliased, parameters, DSL.comment("")); - } - - public Workflow(Table child, ForeignKey key) { - super(child, key, WORKFLOW); - } - - @Override - public Schema getSchema() { - return TexeraDb.TEXERA_DB; - } - - @Override - public List getIndexes() { - return Arrays.asList(Indexes.WORKFLOW_IDX_WORKFLOW_NAME_DESCRIPTION_CONTENT, Indexes.WORKFLOW_PRIMARY); - } - - @Override - public Identity getIdentity() { - return Keys.IDENTITY_WORKFLOW; - } - - @Override - public UniqueKey getPrimaryKey() { - return Keys.KEY_WORKFLOW_PRIMARY; - } - - @Override - public List> getKeys() { - return Arrays.>asList(Keys.KEY_WORKFLOW_PRIMARY); - } - - @Override - public Workflow as(String alias) { - return new Workflow(DSL.name(alias), this); - } - - @Override - public Workflow as(Name alias) { - return new Workflow(alias, this); - } - - /** - * Rename this table - */ - @Override - public Workflow rename(String name) { - return new Workflow(DSL.name(name), null); - } - - /** - * Rename this table - */ - @Override - public Workflow rename(Name name) { - return new Workflow(name, null); - } - - // ------------------------------------------------------------------------- - // Row7 type methods - // 
------------------------------------------------------------------------- - - @Override - public Row7 fieldsRow() { - return (Row7) super.fieldsRow(); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowExecutions.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowExecutions.java deleted file mode 100644 index cf5c3166e91..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowExecutions.java +++ /dev/null @@ -1,202 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.Indexes; -import edu.uci.ics.texera.web.model.jooq.generated.Keys; -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowExecutionsRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowExecutions extends TableImpl { - - private static final long serialVersionUID = -2101880159; - - /** - * The reference instance of texera_db.workflow_executions - */ - public static final WorkflowExecutions WORKFLOW_EXECUTIONS = new WorkflowExecutions(); - - /** - * The class holding records for this type - */ - @Override - public Class getRecordType() { - return WorkflowExecutionsRecord.class; - } - - /** - * The column texera_db.workflow_executions.eid. - */ - public final TableField EID = createField(DSL.name("eid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).identity(true), this, ""); - - /** - * The column texera_db.workflow_executions.vid. 
- */ - public final TableField VID = createField(DSL.name("vid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * The column texera_db.workflow_executions.uid. - */ - public final TableField UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * The column texera_db.workflow_executions.status. - */ - public final TableField STATUS = createField(DSL.name("status"), org.jooq.impl.SQLDataType.TINYINT.nullable(false).defaultValue(org.jooq.impl.DSL.inline("1", org.jooq.impl.SQLDataType.TINYINT)), this, ""); - - /** - * The column texera_db.workflow_executions.result. - */ - public final TableField RESULT = createField(DSL.name("result"), org.jooq.impl.SQLDataType.CLOB, this, ""); - - /** - * The column texera_db.workflow_executions.starting_time. - */ - public final TableField STARTING_TIME = createField(DSL.name("starting_time"), org.jooq.impl.SQLDataType.TIMESTAMP.nullable(false).defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP", org.jooq.impl.SQLDataType.TIMESTAMP)), this, ""); - - /** - * The column texera_db.workflow_executions.last_update_time. - */ - public final TableField LAST_UPDATE_TIME = createField(DSL.name("last_update_time"), org.jooq.impl.SQLDataType.TIMESTAMP, this, ""); - - /** - * The column texera_db.workflow_executions.bookmarked. - */ - public final TableField BOOKMARKED = createField(DSL.name("bookmarked"), org.jooq.impl.SQLDataType.TINYINT.defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.TINYINT)), this, ""); - - /** - * The column texera_db.workflow_executions.name. - */ - public final TableField NAME = createField(DSL.name("name"), org.jooq.impl.SQLDataType.VARCHAR(128).nullable(false).defaultValue(org.jooq.impl.DSL.inline("Untitled Execution", org.jooq.impl.SQLDataType.VARCHAR)), this, ""); - - /** - * The column texera_db.workflow_executions.environment_version. 
- */ - public final TableField ENVIRONMENT_VERSION = createField(DSL.name("environment_version"), org.jooq.impl.SQLDataType.VARCHAR(128).nullable(false), this, ""); - - /** - * The column texera_db.workflow_executions.log_location. - */ - public final TableField LOG_LOCATION = createField(DSL.name("log_location"), org.jooq.impl.SQLDataType.CLOB, this, ""); - - /** - * Create a texera_db.workflow_executions table reference - */ - public WorkflowExecutions() { - this(DSL.name("workflow_executions"), null); - } - - /** - * Create an aliased texera_db.workflow_executions table reference - */ - public WorkflowExecutions(String alias) { - this(DSL.name(alias), WORKFLOW_EXECUTIONS); - } - - /** - * Create an aliased texera_db.workflow_executions table reference - */ - public WorkflowExecutions(Name alias) { - this(alias, WORKFLOW_EXECUTIONS); - } - - private WorkflowExecutions(Name alias, Table aliased) { - this(alias, aliased, null); - } - - private WorkflowExecutions(Name alias, Table aliased, Field[] parameters) { - super(alias, null, aliased, parameters, DSL.comment("")); - } - - public WorkflowExecutions(Table child, ForeignKey key) { - super(child, key, WORKFLOW_EXECUTIONS); - } - - @Override - public Schema getSchema() { - return TexeraDb.TEXERA_DB; - } - - @Override - public List getIndexes() { - return Arrays.asList(Indexes.WORKFLOW_EXECUTIONS_PRIMARY, Indexes.WORKFLOW_EXECUTIONS_UID, Indexes.WORKFLOW_EXECUTIONS_VID); - } - - @Override - public Identity getIdentity() { - return Keys.IDENTITY_WORKFLOW_EXECUTIONS; - } - - @Override - public UniqueKey getPrimaryKey() { - return Keys.KEY_WORKFLOW_EXECUTIONS_PRIMARY; - } - - @Override - public List> getKeys() { - return Arrays.>asList(Keys.KEY_WORKFLOW_EXECUTIONS_PRIMARY); - } - - @Override - public List> getReferences() { - return Arrays.>asList(Keys.WORKFLOW_EXECUTIONS_IBFK_1, Keys.WORKFLOW_EXECUTIONS_IBFK_2); - } - - public WorkflowVersion workflowVersion() { - return new WorkflowVersion(this, 
Keys.WORKFLOW_EXECUTIONS_IBFK_1); - } - - public User user() { - return new User(this, Keys.WORKFLOW_EXECUTIONS_IBFK_2); - } - - @Override - public WorkflowExecutions as(String alias) { - return new WorkflowExecutions(DSL.name(alias), this); - } - - @Override - public WorkflowExecutions as(Name alias) { - return new WorkflowExecutions(alias, this); - } - - /** - * Rename this table - */ - @Override - public WorkflowExecutions rename(String name) { - return new WorkflowExecutions(DSL.name(name), null); - } - - /** - * Rename this table - */ - @Override - public WorkflowExecutions rename(Name name) { - return new WorkflowExecutions(name, null); - } - - // ------------------------------------------------------------------------- - // Row11 type methods - // ------------------------------------------------------------------------- - - @Override - public Row11 fieldsRow() { - return (Row11) super.fieldsRow(); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowOfProject.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowOfProject.java deleted file mode 100644 index f631813fb55..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowOfProject.java +++ /dev/null @@ -1,151 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.Indexes; -import edu.uci.ics.texera.web.model.jooq.generated.Keys; -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowOfProjectRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; - -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowOfProject extends TableImpl { - - private static final long serialVersionUID = 644137750; - - /** - * The reference instance of texera_db.workflow_of_project - */ - public static final WorkflowOfProject WORKFLOW_OF_PROJECT = new WorkflowOfProject(); - - /** - * The class holding records for this type - */ - @Override - public Class getRecordType() { - return WorkflowOfProjectRecord.class; - } - - /** - * The column texera_db.workflow_of_project.wid. - */ - public final TableField WID = createField(DSL.name("wid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * The column texera_db.workflow_of_project.pid. - */ - public final TableField PID = createField(DSL.name("pid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * Create a texera_db.workflow_of_project table reference - */ - public WorkflowOfProject() { - this(DSL.name("workflow_of_project"), null); - } - - /** - * Create an aliased texera_db.workflow_of_project table reference - */ - public WorkflowOfProject(String alias) { - this(DSL.name(alias), WORKFLOW_OF_PROJECT); - } - - /** - * Create an aliased texera_db.workflow_of_project table reference - */ - public WorkflowOfProject(Name alias) { - this(alias, WORKFLOW_OF_PROJECT); - } - - private WorkflowOfProject(Name alias, Table aliased) { - this(alias, aliased, null); - } - - private WorkflowOfProject(Name alias, Table aliased, Field[] parameters) { - super(alias, null, aliased, parameters, DSL.comment("")); - } - - public WorkflowOfProject(Table child, ForeignKey key) { - super(child, key, WORKFLOW_OF_PROJECT); - } - - @Override - public Schema getSchema() { - return TexeraDb.TEXERA_DB; - } - - @Override - public List getIndexes() { - return Arrays.asList(Indexes.WORKFLOW_OF_PROJECT_PID, Indexes.WORKFLOW_OF_PROJECT_PRIMARY); - } - - @Override - public UniqueKey getPrimaryKey() { - return 
Keys.KEY_WORKFLOW_OF_PROJECT_PRIMARY; - } - - @Override - public List> getKeys() { - return Arrays.>asList(Keys.KEY_WORKFLOW_OF_PROJECT_PRIMARY); - } - - @Override - public List> getReferences() { - return Arrays.>asList(Keys.WORKFLOW_OF_PROJECT_IBFK_1, Keys.WORKFLOW_OF_PROJECT_IBFK_2); - } - - public Workflow workflow() { - return new Workflow(this, Keys.WORKFLOW_OF_PROJECT_IBFK_1); - } - - public Project project() { - return new Project(this, Keys.WORKFLOW_OF_PROJECT_IBFK_2); - } - - @Override - public WorkflowOfProject as(String alias) { - return new WorkflowOfProject(DSL.name(alias), this); - } - - @Override - public WorkflowOfProject as(Name alias) { - return new WorkflowOfProject(alias, this); - } - - /** - * Rename this table - */ - @Override - public WorkflowOfProject rename(String name) { - return new WorkflowOfProject(DSL.name(name), null); - } - - /** - * Rename this table - */ - @Override - public WorkflowOfProject rename(Name name) { - return new WorkflowOfProject(name, null); - } - - // ------------------------------------------------------------------------- - // Row2 type methods - // ------------------------------------------------------------------------- - - @Override - public Row2 fieldsRow() { - return (Row2) super.fieldsRow(); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowOfUser.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowOfUser.java deleted file mode 100644 index 60f17874db2..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowOfUser.java +++ /dev/null @@ -1,151 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.Indexes; -import edu.uci.ics.texera.web.model.jooq.generated.Keys; -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowOfUserRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; - -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowOfUser extends TableImpl { - - private static final long serialVersionUID = 1187907428; - - /** - * The reference instance of texera_db.workflow_of_user - */ - public static final WorkflowOfUser WORKFLOW_OF_USER = new WorkflowOfUser(); - - /** - * The class holding records for this type - */ - @Override - public Class getRecordType() { - return WorkflowOfUserRecord.class; - } - - /** - * The column texera_db.workflow_of_user.uid. - */ - public final TableField UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * The column texera_db.workflow_of_user.wid. 
- */ - public final TableField WID = createField(DSL.name("wid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * Create a texera_db.workflow_of_user table reference - */ - public WorkflowOfUser() { - this(DSL.name("workflow_of_user"), null); - } - - /** - * Create an aliased texera_db.workflow_of_user table reference - */ - public WorkflowOfUser(String alias) { - this(DSL.name(alias), WORKFLOW_OF_USER); - } - - /** - * Create an aliased texera_db.workflow_of_user table reference - */ - public WorkflowOfUser(Name alias) { - this(alias, WORKFLOW_OF_USER); - } - - private WorkflowOfUser(Name alias, Table aliased) { - this(alias, aliased, null); - } - - private WorkflowOfUser(Name alias, Table aliased, Field[] parameters) { - super(alias, null, aliased, parameters, DSL.comment("")); - } - - public WorkflowOfUser(Table child, ForeignKey key) { - super(child, key, WORKFLOW_OF_USER); - } - - @Override - public Schema getSchema() { - return TexeraDb.TEXERA_DB; - } - - @Override - public List getIndexes() { - return Arrays.asList(Indexes.WORKFLOW_OF_USER_PRIMARY, Indexes.WORKFLOW_OF_USER_WID); - } - - @Override - public UniqueKey getPrimaryKey() { - return Keys.KEY_WORKFLOW_OF_USER_PRIMARY; - } - - @Override - public List> getKeys() { - return Arrays.>asList(Keys.KEY_WORKFLOW_OF_USER_PRIMARY); - } - - @Override - public List> getReferences() { - return Arrays.>asList(Keys.WORKFLOW_OF_USER_IBFK_1, Keys.WORKFLOW_OF_USER_IBFK_2); - } - - public User user() { - return new User(this, Keys.WORKFLOW_OF_USER_IBFK_1); - } - - public Workflow workflow() { - return new Workflow(this, Keys.WORKFLOW_OF_USER_IBFK_2); - } - - @Override - public WorkflowOfUser as(String alias) { - return new WorkflowOfUser(DSL.name(alias), this); - } - - @Override - public WorkflowOfUser as(Name alias) { - return new WorkflowOfUser(alias, this); - } - - /** - * Rename this table - */ - @Override - public WorkflowOfUser rename(String name) { - return new 
WorkflowOfUser(DSL.name(name), null); - } - - /** - * Rename this table - */ - @Override - public WorkflowOfUser rename(Name name) { - return new WorkflowOfUser(name, null); - } - - // ------------------------------------------------------------------------- - // Row2 type methods - // ------------------------------------------------------------------------- - - @Override - public Row2 fieldsRow() { - return (Row2) super.fieldsRow(); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowRuntimeStatistics.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowRuntimeStatistics.java deleted file mode 100644 index 675ba4da173..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowRuntimeStatistics.java +++ /dev/null @@ -1,198 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.Indexes; -import edu.uci.ics.texera.web.model.jooq.generated.Keys; -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowRuntimeStatisticsRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; -import org.jooq.types.ULong; - -import java.sql.Timestamp; -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class WorkflowRuntimeStatistics extends TableImpl<WorkflowRuntimeStatisticsRecord> {
-
-    private static final long serialVersionUID = -1646851437;
-
-    /**
-     * The reference instance of <code>texera_db.workflow_runtime_statistics</code>
-     */
-    public static final WorkflowRuntimeStatistics WORKFLOW_RUNTIME_STATISTICS = new WorkflowRuntimeStatistics();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<WorkflowRuntimeStatisticsRecord> getRecordType() {
-        return WorkflowRuntimeStatisticsRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.workflow_id</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, UInteger> WORKFLOW_ID = createField(DSL.name("workflow_id"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.execution_id</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, UInteger> EXECUTION_ID = createField(DSL.name("execution_id"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.operator_id</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, String> OPERATOR_ID = createField(DSL.name("operator_id"), org.jooq.impl.SQLDataType.VARCHAR(512).nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.time</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, Timestamp> TIME = createField(DSL.name("time"), org.jooq.impl.SQLDataType.TIMESTAMP.nullable(false).defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP(6)", org.jooq.impl.SQLDataType.TIMESTAMP)), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.input_tuple_cnt</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, UInteger> INPUT_TUPLE_CNT = createField(DSL.name("input_tuple_cnt"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.INTEGERUNSIGNED)), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.output_tuple_cnt</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, UInteger> OUTPUT_TUPLE_CNT = createField(DSL.name("output_tuple_cnt"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.INTEGERUNSIGNED)), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.status</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, Byte> STATUS = createField(DSL.name("status"), org.jooq.impl.SQLDataType.TINYINT.nullable(false).defaultValue(org.jooq.impl.DSL.inline("1", org.jooq.impl.SQLDataType.TINYINT)), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.data_processing_time</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, ULong> DATA_PROCESSING_TIME = createField(DSL.name("data_processing_time"), org.jooq.impl.SQLDataType.BIGINTUNSIGNED.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.BIGINTUNSIGNED)), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.control_processing_time</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, ULong> CONTROL_PROCESSING_TIME = createField(DSL.name("control_processing_time"), org.jooq.impl.SQLDataType.BIGINTUNSIGNED.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.BIGINTUNSIGNED)), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.idle_time</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, ULong> IDLE_TIME = createField(DSL.name("idle_time"), org.jooq.impl.SQLDataType.BIGINTUNSIGNED.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.BIGINTUNSIGNED)), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_runtime_statistics.num_workers</code>.
-     */
-    public final TableField<WorkflowRuntimeStatisticsRecord, UInteger> NUM_WORKERS = createField(DSL.name("num_workers"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.INTEGERUNSIGNED)), this, "");
-
-    /**
-     * Create a <code>texera_db.workflow_runtime_statistics</code> table reference
-     */
-    public WorkflowRuntimeStatistics() {
-        this(DSL.name("workflow_runtime_statistics"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.workflow_runtime_statistics</code> table reference
-     */
-    public WorkflowRuntimeStatistics(String alias) {
-        this(DSL.name(alias), WORKFLOW_RUNTIME_STATISTICS);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.workflow_runtime_statistics</code> table reference
-     */
-    public WorkflowRuntimeStatistics(Name alias) {
-        this(alias, WORKFLOW_RUNTIME_STATISTICS);
-    }
-
-    private WorkflowRuntimeStatistics(Name alias, Table<WorkflowRuntimeStatisticsRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private WorkflowRuntimeStatistics(Name alias, Table<WorkflowRuntimeStatisticsRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> WorkflowRuntimeStatistics(Table<O> child, ForeignKey<O, WorkflowRuntimeStatisticsRecord> key) {
-        super(child, key, WORKFLOW_RUNTIME_STATISTICS);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.WORKFLOW_RUNTIME_STATISTICS_EXECUTION_ID, Indexes.WORKFLOW_RUNTIME_STATISTICS_PRIMARY);
-    }
-
-    @Override
-    public UniqueKey<WorkflowRuntimeStatisticsRecord> getPrimaryKey() {
-        return Keys.KEY_WORKFLOW_RUNTIME_STATISTICS_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<WorkflowRuntimeStatisticsRecord>> getKeys() {
-        return Arrays.<UniqueKey<WorkflowRuntimeStatisticsRecord>>asList(Keys.KEY_WORKFLOW_RUNTIME_STATISTICS_PRIMARY);
-    }
-
-    @Override
-    public List<ForeignKey<WorkflowRuntimeStatisticsRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<WorkflowRuntimeStatisticsRecord, ?>>asList(Keys.WORKFLOW_RUNTIME_STATISTICS_IBFK_1, Keys.WORKFLOW_RUNTIME_STATISTICS_IBFK_2);
-    }
-
-    public Workflow workflow() {
-        return new Workflow(this, Keys.WORKFLOW_RUNTIME_STATISTICS_IBFK_1);
-    }
-
-    public WorkflowExecutions workflowExecutions() {
-        return new WorkflowExecutions(this, Keys.WORKFLOW_RUNTIME_STATISTICS_IBFK_2);
-    }
-
-    @Override
-    public WorkflowRuntimeStatistics as(String alias) {
-        return new WorkflowRuntimeStatistics(DSL.name(alias), this);
-    }
-
-    @Override
-    public WorkflowRuntimeStatistics as(Name alias) {
-        return new WorkflowRuntimeStatistics(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public WorkflowRuntimeStatistics rename(String name) {
-        return new WorkflowRuntimeStatistics(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public WorkflowRuntimeStatistics rename(Name name) {
-        return new WorkflowRuntimeStatistics(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row11 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row11<UInteger, UInteger, String, Timestamp, UInteger, UInteger, Byte, ULong, ULong, ULong, UInteger> fieldsRow() {
-        return (Row11) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserAccess.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserAccess.java
deleted file mode 100644
index fe9d5fc1091..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserAccess.java
+++ /dev/null
@@ -1,157 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.Indexes;
-import edu.uci.ics.texera.web.model.jooq.generated.Keys;
-import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb;
-import edu.uci.ics.texera.web.model.jooq.generated.enums.WorkflowUserAccessPrivilege;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowUserAccessRecord;
-import org.jooq.*;
-import org.jooq.impl.DSL;
-import org.jooq.impl.TableImpl;
-import org.jooq.types.UInteger;
-
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserAccess extends TableImpl { - - private static final long serialVersionUID = 712932299; - - /** - * The reference instance of texera_db.workflow_user_access - */ - public static final WorkflowUserAccess WORKFLOW_USER_ACCESS = new WorkflowUserAccess(); - - /** - * The class holding records for this type - */ - @Override - public Class getRecordType() { - return WorkflowUserAccessRecord.class; - } - - /** - * The column texera_db.workflow_user_access.uid. - */ - public final TableField UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * The column texera_db.workflow_user_access.wid. - */ - public final TableField WID = createField(DSL.name("wid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * The column texera_db.workflow_user_access.privilege. - */ - public final TableField PRIVILEGE = createField(DSL.name("privilege"), org.jooq.impl.SQLDataType.VARCHAR(5).nullable(false).defaultValue(org.jooq.impl.DSL.inline("NONE", org.jooq.impl.SQLDataType.VARCHAR)).asEnumDataType(edu.uci.ics.texera.web.model.jooq.generated.enums.WorkflowUserAccessPrivilege.class), this, ""); - - /** - * Create a texera_db.workflow_user_access table reference - */ - public WorkflowUserAccess() { - this(DSL.name("workflow_user_access"), null); - } - - /** - * Create an aliased texera_db.workflow_user_access table reference - */ - public WorkflowUserAccess(String alias) { - this(DSL.name(alias), WORKFLOW_USER_ACCESS); - } - - /** - * Create an aliased texera_db.workflow_user_access table reference - */ - public WorkflowUserAccess(Name alias) { - this(alias, WORKFLOW_USER_ACCESS); - } - - private WorkflowUserAccess(Name alias, Table aliased) { - this(alias, aliased, null); - } - - private WorkflowUserAccess(Name alias, Table aliased, Field[] parameters) { - super(alias, null, aliased, parameters, DSL.comment("")); - } 
- - public WorkflowUserAccess(Table child, ForeignKey key) { - super(child, key, WORKFLOW_USER_ACCESS); - } - - @Override - public Schema getSchema() { - return TexeraDb.TEXERA_DB; - } - - @Override - public List getIndexes() { - return Arrays.asList(Indexes.WORKFLOW_USER_ACCESS_PRIMARY, Indexes.WORKFLOW_USER_ACCESS_WID); - } - - @Override - public UniqueKey getPrimaryKey() { - return Keys.KEY_WORKFLOW_USER_ACCESS_PRIMARY; - } - - @Override - public List> getKeys() { - return Arrays.>asList(Keys.KEY_WORKFLOW_USER_ACCESS_PRIMARY); - } - - @Override - public List> getReferences() { - return Arrays.>asList(Keys.WORKFLOW_USER_ACCESS_IBFK_1, Keys.WORKFLOW_USER_ACCESS_IBFK_2); - } - - public User user() { - return new User(this, Keys.WORKFLOW_USER_ACCESS_IBFK_1); - } - - public Workflow workflow() { - return new Workflow(this, Keys.WORKFLOW_USER_ACCESS_IBFK_2); - } - - @Override - public WorkflowUserAccess as(String alias) { - return new WorkflowUserAccess(DSL.name(alias), this); - } - - @Override - public WorkflowUserAccess as(Name alias) { - return new WorkflowUserAccess(alias, this); - } - - /** - * Rename this table - */ - @Override - public WorkflowUserAccess rename(String name) { - return new WorkflowUserAccess(DSL.name(name), null); - } - - /** - * Rename this table - */ - @Override - public WorkflowUserAccess rename(Name name) { - return new WorkflowUserAccess(name, null); - } - - // ------------------------------------------------------------------------- - // Row3 type methods - // ------------------------------------------------------------------------- - - @Override - public Row3 fieldsRow() { - return (Row3) super.fieldsRow(); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserActivity.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserActivity.java deleted file mode 100644 index 584934c3a95..00000000000 --- 
a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserActivity.java +++ /dev/null @@ -1,135 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowUserActivityRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserActivity extends TableImpl { - - private static final long serialVersionUID = 1692655664; - - /** - * The reference instance of texera_db.workflow_user_activity - */ - public static final WorkflowUserActivity WORKFLOW_USER_ACTIVITY = new WorkflowUserActivity(); - - /** - * The class holding records for this type - */ - @Override - public Class getRecordType() { - return WorkflowUserActivityRecord.class; - } - - /** - * The column texera_db.workflow_user_activity.uid. - */ - public final TableField UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.INTEGERUNSIGNED)), this, ""); - - /** - * The column texera_db.workflow_user_activity.wid. - */ - public final TableField WID = createField(DSL.name("wid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * The column texera_db.workflow_user_activity.ip. - */ - public final TableField IP = createField(DSL.name("ip"), org.jooq.impl.SQLDataType.VARCHAR(15), this, ""); - - /** - * The column texera_db.workflow_user_activity.activate. 
- */ - public final TableField ACTIVATE = createField(DSL.name("activate"), org.jooq.impl.SQLDataType.VARCHAR(10).nullable(false), this, ""); - - /** - * The column texera_db.workflow_user_activity.activity_time. - */ - public final TableField ACTIVITY_TIME = createField(DSL.name("activity_time"), org.jooq.impl.SQLDataType.TIMESTAMP.defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP", org.jooq.impl.SQLDataType.TIMESTAMP)), this, ""); - - /** - * Create a texera_db.workflow_user_activity table reference - */ - public WorkflowUserActivity() { - this(DSL.name("workflow_user_activity"), null); - } - - /** - * Create an aliased texera_db.workflow_user_activity table reference - */ - public WorkflowUserActivity(String alias) { - this(DSL.name(alias), WORKFLOW_USER_ACTIVITY); - } - - /** - * Create an aliased texera_db.workflow_user_activity table reference - */ - public WorkflowUserActivity(Name alias) { - this(alias, WORKFLOW_USER_ACTIVITY); - } - - private WorkflowUserActivity(Name alias, Table aliased) { - this(alias, aliased, null); - } - - private WorkflowUserActivity(Name alias, Table aliased, Field[] parameters) { - super(alias, null, aliased, parameters, DSL.comment("")); - } - - public WorkflowUserActivity(Table child, ForeignKey key) { - super(child, key, WORKFLOW_USER_ACTIVITY); - } - - @Override - public Schema getSchema() { - return TexeraDb.TEXERA_DB; - } - - @Override - public WorkflowUserActivity as(String alias) { - return new WorkflowUserActivity(DSL.name(alias), this); - } - - @Override - public WorkflowUserActivity as(Name alias) { - return new WorkflowUserActivity(alias, this); - } - - /** - * Rename this table - */ - @Override - public WorkflowUserActivity rename(String name) { - return new WorkflowUserActivity(DSL.name(name), null); - } - - /** - * Rename this table - */ - @Override - public WorkflowUserActivity rename(Name name) { - return new WorkflowUserActivity(name, null); - } - - // 
------------------------------------------------------------------------- - // Row5 type methods - // ------------------------------------------------------------------------- - - @Override - public Row5 fieldsRow() { - return (Row5) super.fieldsRow(); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserClones.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserClones.java deleted file mode 100644 index 2018735f90a..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserClones.java +++ /dev/null @@ -1,151 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.Indexes; -import edu.uci.ics.texera.web.model.jooq.generated.Keys; -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowUserClonesRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; - -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserClones extends TableImpl { - - private static final long serialVersionUID = -666496644; - - /** - * The reference instance of texera_db.workflow_user_clones - */ - public static final WorkflowUserClones WORKFLOW_USER_CLONES = new WorkflowUserClones(); - - /** - * The class holding records for this type - */ - @Override - public Class getRecordType() { - return WorkflowUserClonesRecord.class; - } - - /** - * The column texera_db.workflow_user_clones.uid. 
- */ - public final TableField UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * The column texera_db.workflow_user_clones.wid. - */ - public final TableField WID = createField(DSL.name("wid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, ""); - - /** - * Create a texera_db.workflow_user_clones table reference - */ - public WorkflowUserClones() { - this(DSL.name("workflow_user_clones"), null); - } - - /** - * Create an aliased texera_db.workflow_user_clones table reference - */ - public WorkflowUserClones(String alias) { - this(DSL.name(alias), WORKFLOW_USER_CLONES); - } - - /** - * Create an aliased texera_db.workflow_user_clones table reference - */ - public WorkflowUserClones(Name alias) { - this(alias, WORKFLOW_USER_CLONES); - } - - private WorkflowUserClones(Name alias, Table aliased) { - this(alias, aliased, null); - } - - private WorkflowUserClones(Name alias, Table aliased, Field[] parameters) { - super(alias, null, aliased, parameters, DSL.comment("")); - } - - public WorkflowUserClones(Table child, ForeignKey key) { - super(child, key, WORKFLOW_USER_CLONES); - } - - @Override - public Schema getSchema() { - return TexeraDb.TEXERA_DB; - } - - @Override - public List getIndexes() { - return Arrays.asList(Indexes.WORKFLOW_USER_CLONES_PRIMARY, Indexes.WORKFLOW_USER_CLONES_WID); - } - - @Override - public UniqueKey getPrimaryKey() { - return Keys.KEY_WORKFLOW_USER_CLONES_PRIMARY; - } - - @Override - public List> getKeys() { - return Arrays.>asList(Keys.KEY_WORKFLOW_USER_CLONES_PRIMARY); - } - - @Override - public List> getReferences() { - return Arrays.>asList(Keys.WORKFLOW_USER_CLONES_IBFK_1, Keys.WORKFLOW_USER_CLONES_IBFK_2); - } - - public User user() { - return new User(this, Keys.WORKFLOW_USER_CLONES_IBFK_1); - } - - public Workflow workflow() { - return new Workflow(this, Keys.WORKFLOW_USER_CLONES_IBFK_2); - } - - @Override - public WorkflowUserClones as(String alias) 
{ - return new WorkflowUserClones(DSL.name(alias), this); - } - - @Override - public WorkflowUserClones as(Name alias) { - return new WorkflowUserClones(alias, this); - } - - /** - * Rename this table - */ - @Override - public WorkflowUserClones rename(String name) { - return new WorkflowUserClones(DSL.name(name), null); - } - - /** - * Rename this table - */ - @Override - public WorkflowUserClones rename(Name name) { - return new WorkflowUserClones(name, null); - } - - // ------------------------------------------------------------------------- - // Row2 type methods - // ------------------------------------------------------------------------- - - @Override - public Row2 fieldsRow() { - return (Row2) super.fieldsRow(); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserLikes.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserLikes.java deleted file mode 100644 index 11c2e8e6d1b..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowUserLikes.java +++ /dev/null @@ -1,151 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables; - - -import edu.uci.ics.texera.web.model.jooq.generated.Indexes; -import edu.uci.ics.texera.web.model.jooq.generated.Keys; -import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowUserLikesRecord; -import org.jooq.*; -import org.jooq.impl.DSL; -import org.jooq.impl.TableImpl; -import org.jooq.types.UInteger; - -import java.util.Arrays; -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class WorkflowUserLikes extends TableImpl<WorkflowUserLikesRecord> {
-
-    private static final long serialVersionUID = -470369784;
-
-    /**
-     * The reference instance of <code>texera_db.workflow_user_likes</code>
-     */
-    public static final WorkflowUserLikes WORKFLOW_USER_LIKES = new WorkflowUserLikes();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<WorkflowUserLikesRecord> getRecordType() {
-        return WorkflowUserLikesRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.workflow_user_likes.uid</code>.
-     */
-    public final TableField<WorkflowUserLikesRecord, UInteger> UID = createField(DSL.name("uid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_user_likes.wid</code>.
-     */
-    public final TableField<WorkflowUserLikesRecord, UInteger> WID = createField(DSL.name("wid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * Create a <code>texera_db.workflow_user_likes</code> table reference
-     */
-    public WorkflowUserLikes() {
-        this(DSL.name("workflow_user_likes"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.workflow_user_likes</code> table reference
-     */
-    public WorkflowUserLikes(String alias) {
-        this(DSL.name(alias), WORKFLOW_USER_LIKES);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.workflow_user_likes</code> table reference
-     */
-    public WorkflowUserLikes(Name alias) {
-        this(alias, WORKFLOW_USER_LIKES);
-    }
-
-    private WorkflowUserLikes(Name alias, Table<WorkflowUserLikesRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private WorkflowUserLikes(Name alias, Table<WorkflowUserLikesRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> WorkflowUserLikes(Table<O> child, ForeignKey<O, WorkflowUserLikesRecord> key) {
-        super(child, key, WORKFLOW_USER_LIKES);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.WORKFLOW_USER_LIKES_PRIMARY, Indexes.WORKFLOW_USER_LIKES_WID);
-    }
-
-    @Override
-    public UniqueKey<WorkflowUserLikesRecord> getPrimaryKey() {
-        return Keys.KEY_WORKFLOW_USER_LIKES_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<WorkflowUserLikesRecord>> getKeys() {
-        return Arrays.<UniqueKey<WorkflowUserLikesRecord>>asList(Keys.KEY_WORKFLOW_USER_LIKES_PRIMARY);
-    }
-
-    @Override
-    public List<ForeignKey<WorkflowUserLikesRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<WorkflowUserLikesRecord, ?>>asList(Keys.WORKFLOW_USER_LIKES_IBFK_1, Keys.WORKFLOW_USER_LIKES_IBFK_2);
-    }
-
-    public User user() {
-        return new User(this, Keys.WORKFLOW_USER_LIKES_IBFK_1);
-    }
-
-    public Workflow workflow() {
-        return new Workflow(this, Keys.WORKFLOW_USER_LIKES_IBFK_2);
-    }
-
-    @Override
-    public WorkflowUserLikes as(String alias) {
-        return new WorkflowUserLikes(DSL.name(alias), this);
-    }
-
-    @Override
-    public WorkflowUserLikes as(Name alias) {
-        return new WorkflowUserLikes(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public WorkflowUserLikes rename(String name) {
-        return new WorkflowUserLikes(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public WorkflowUserLikes rename(Name name) {
-        return new WorkflowUserLikes(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row2 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row2<UInteger, UInteger> fieldsRow() {
-        return (Row2) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowVersion.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowVersion.java
deleted file mode 100644
index 63ca76708ec..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowVersion.java
+++ /dev/null
@@ -1,163 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.Indexes;
-import edu.uci.ics.texera.web.model.jooq.generated.Keys;
-import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowVersionRecord;
-import org.jooq.*;
-import org.jooq.impl.DSL;
-import org.jooq.impl.TableImpl;
-import org.jooq.types.UInteger;
-
-import java.sql.Timestamp;
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class WorkflowVersion extends TableImpl<WorkflowVersionRecord> {
-
-    private static final long serialVersionUID = -1701791149;
-
-    /**
-     * The reference instance of <code>texera_db.workflow_version</code>
-     */
-    public static final WorkflowVersion WORKFLOW_VERSION = new WorkflowVersion();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<WorkflowVersionRecord> getRecordType() {
-        return WorkflowVersionRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.workflow_version.vid</code>.
-     */
-    public final TableField<WorkflowVersionRecord, UInteger> VID = createField(DSL.name("vid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).identity(true), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_version.wid</code>.
-     */
-    public final TableField<WorkflowVersionRecord, UInteger> WID = createField(DSL.name("wid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_version.content</code>.
-     */
-    public final TableField<WorkflowVersionRecord, String> CONTENT = createField(DSL.name("content"), org.jooq.impl.SQLDataType.CLOB.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_version.creation_time</code>.
- */
-    public final TableField<WorkflowVersionRecord, Timestamp> CREATION_TIME = createField(DSL.name("creation_time"), org.jooq.impl.SQLDataType.TIMESTAMP.nullable(false).defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP", org.jooq.impl.SQLDataType.TIMESTAMP)), this, "");
-
-    /**
-     * Create a <code>texera_db.workflow_version</code> table reference
-     */
-    public WorkflowVersion() {
-        this(DSL.name("workflow_version"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.workflow_version</code> table reference
-     */
-    public WorkflowVersion(String alias) {
-        this(DSL.name(alias), WORKFLOW_VERSION);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.workflow_version</code> table reference
-     */
-    public WorkflowVersion(Name alias) {
-        this(alias, WORKFLOW_VERSION);
-    }
-
-    private WorkflowVersion(Name alias, Table<WorkflowVersionRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private WorkflowVersion(Name alias, Table<WorkflowVersionRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> WorkflowVersion(Table<O> child, ForeignKey<O, WorkflowVersionRecord> key) {
-        super(child, key, WORKFLOW_VERSION);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.WORKFLOW_VERSION_PRIMARY, Indexes.WORKFLOW_VERSION_WID);
-    }
-
-    @Override
-    public Identity<WorkflowVersionRecord, UInteger> getIdentity() {
-        return Keys.IDENTITY_WORKFLOW_VERSION;
-    }
-
-    @Override
-    public UniqueKey<WorkflowVersionRecord> getPrimaryKey() {
-        return Keys.KEY_WORKFLOW_VERSION_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<WorkflowVersionRecord>> getKeys() {
-        return Arrays.<UniqueKey<WorkflowVersionRecord>>asList(Keys.KEY_WORKFLOW_VERSION_PRIMARY);
-    }
-
-    @Override
-    public List<ForeignKey<WorkflowVersionRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<WorkflowVersionRecord, ?>>asList(Keys.WORKFLOW_VERSION_IBFK_1);
-    }
-
-    public Workflow workflow() {
-        return new Workflow(this, Keys.WORKFLOW_VERSION_IBFK_1);
-    }
-
-    @Override
-    public WorkflowVersion as(String alias) {
-        return new WorkflowVersion(DSL.name(alias), this);
-    }
-
-    @Override
-    public WorkflowVersion as(Name alias) {
-        return new WorkflowVersion(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public WorkflowVersion rename(String name) {
-        return new WorkflowVersion(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public WorkflowVersion rename(Name name) {
-        return new WorkflowVersion(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row4 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row4<UInteger, UInteger, String, Timestamp> fieldsRow() {
-        return (Row4) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowViewCount.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowViewCount.java
deleted file mode 100644
index c34e7cc5efa..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/WorkflowViewCount.java
+++ /dev/null
@@ -1,147 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.Indexes;
-import edu.uci.ics.texera.web.model.jooq.generated.Keys;
-import edu.uci.ics.texera.web.model.jooq.generated.TexeraDb;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowViewCountRecord;
-import org.jooq.*;
-import org.jooq.impl.DSL;
-import org.jooq.impl.TableImpl;
-import org.jooq.types.UInteger;
-
-import java.util.Arrays;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class WorkflowViewCount extends TableImpl<WorkflowViewCountRecord> {
-
-    private static final long serialVersionUID = 89898965;
-
-    /**
-     * The reference instance of <code>texera_db.workflow_view_count</code>
-     */
-    public static final WorkflowViewCount WORKFLOW_VIEW_COUNT = new WorkflowViewCount();
-
-    /**
-     * The class holding records for this type
-     */
-    @Override
-    public Class<WorkflowViewCountRecord> getRecordType() {
-        return WorkflowViewCountRecord.class;
-    }
-
-    /**
-     * The column <code>texera_db.workflow_view_count.wid</code>.
-     */
-    public final TableField<WorkflowViewCountRecord, UInteger> WID = createField(DSL.name("wid"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false), this, "");
-
-    /**
-     * The column <code>texera_db.workflow_view_count.view_count</code>.
-     */
-    public final TableField<WorkflowViewCountRecord, UInteger> VIEW_COUNT = createField(DSL.name("view_count"), org.jooq.impl.SQLDataType.INTEGERUNSIGNED.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.INTEGERUNSIGNED)), this, "");
-
-    /**
-     * Create a <code>texera_db.workflow_view_count</code> table reference
-     */
-    public WorkflowViewCount() {
-        this(DSL.name("workflow_view_count"), null);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.workflow_view_count</code> table reference
-     */
-    public WorkflowViewCount(String alias) {
-        this(DSL.name(alias), WORKFLOW_VIEW_COUNT);
-    }
-
-    /**
-     * Create an aliased <code>texera_db.workflow_view_count</code> table reference
-     */
-    public WorkflowViewCount(Name alias) {
-        this(alias, WORKFLOW_VIEW_COUNT);
-    }
-
-    private WorkflowViewCount(Name alias, Table<WorkflowViewCountRecord> aliased) {
-        this(alias, aliased, null);
-    }
-
-    private WorkflowViewCount(Name alias, Table<WorkflowViewCountRecord> aliased, Field<?>[] parameters) {
-        super(alias, null, aliased, parameters, DSL.comment(""));
-    }
-
-    public <O extends Record> WorkflowViewCount(Table<O> child, ForeignKey<O, WorkflowViewCountRecord> key) {
-        super(child, key, WORKFLOW_VIEW_COUNT);
-    }
-
-    @Override
-    public Schema getSchema() {
-        return TexeraDb.TEXERA_DB;
-    }
-
-    @Override
-    public List<Index> getIndexes() {
-        return Arrays.asList(Indexes.WORKFLOW_VIEW_COUNT_PRIMARY);
-    }
-
-    @Override
-    public UniqueKey<WorkflowViewCountRecord> getPrimaryKey() {
-        return Keys.KEY_WORKFLOW_VIEW_COUNT_PRIMARY;
-    }
-
-    @Override
-    public List<UniqueKey<WorkflowViewCountRecord>> getKeys() {
-        return Arrays.<UniqueKey<WorkflowViewCountRecord>>asList(Keys.KEY_WORKFLOW_VIEW_COUNT_PRIMARY);
-    }
-
-    @Override
-    public List<ForeignKey<WorkflowViewCountRecord, ?>> getReferences() {
-        return Arrays.<ForeignKey<WorkflowViewCountRecord, ?>>asList(Keys.WORKFLOW_VIEW_COUNT_IBFK_1);
-    }
-
-    public Workflow workflow() {
-        return new Workflow(this, Keys.WORKFLOW_VIEW_COUNT_IBFK_1);
-    }
-
-    @Override
-    public WorkflowViewCount as(String alias) {
-        return new WorkflowViewCount(DSL.name(alias), this);
-    }
-
-    @Override
-    public WorkflowViewCount as(Name alias) {
-        return new WorkflowViewCount(alias, this);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public WorkflowViewCount rename(String name) {
-        return new WorkflowViewCount(DSL.name(name), null);
-    }
-
-    /**
-     * Rename this table
-     */
-    @Override
-    public WorkflowViewCount rename(Name name) {
-        return new WorkflowViewCount(name, null);
-    }
-
-    // -------------------------------------------------------------------------
-    // Row2 type methods
-    // -------------------------------------------------------------------------
-
-    @Override
-    public Row2<UInteger, UInteger> fieldsRow() {
-        return (Row2) super.fieldsRow();
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetDao.java
deleted file mode 100644
index db2f46c0dae..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetDao.java
+++ /dev/null
@@ -1,132 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables.daos;
-
-
-import edu.uci.ics.texera.web.model.jooq.generated.tables.Dataset;
-import edu.uci.ics.texera.web.model.jooq.generated.tables.records.DatasetRecord;
-import org.jooq.Configuration;
-import org.jooq.impl.DAOImpl;
-import org.jooq.types.UInteger;
-
-import java.sql.Timestamp;
-import java.util.List;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class DatasetDao extends DAOImpl<DatasetRecord, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset, UInteger> {
-
-    /**
-     * Create a new DatasetDao without any configuration
-     */
-    public DatasetDao() {
-        super(Dataset.DATASET, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset.class);
-    }
-
-    /**
-     * Create a new DatasetDao with an attached configuration
-     */
-    public DatasetDao(Configuration configuration) {
-        super(Dataset.DATASET, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset.class, configuration);
-    }
-
-    @Override
-    public UInteger getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset object) {
-        return object.getDid();
-    }
-
-    /**
-     * Fetch records that have <code>did BETWEEN lowerInclusive AND upperInclusive</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchRangeOfDid(UInteger lowerInclusive, UInteger upperInclusive) {
-        return fetchRange(Dataset.DATASET.DID, lowerInclusive, upperInclusive);
-    }
-
-    /**
-     * Fetch records that have <code>did IN (values)</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchByDid(UInteger... values) {
-        return fetch(Dataset.DATASET.DID, values);
-    }
-
-    /**
-     * Fetch a unique record that has <code>did = value</code>
-     */
-    public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset fetchOneByDid(UInteger value) {
-        return fetchOne(Dataset.DATASET.DID, value);
-    }
-
-    /**
-     * Fetch records that have <code>owner_uid BETWEEN lowerInclusive AND upperInclusive</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchRangeOfOwnerUid(UInteger lowerInclusive, UInteger upperInclusive) {
-        return fetchRange(Dataset.DATASET.OWNER_UID, lowerInclusive, upperInclusive);
-    }
-
-    /**
-     * Fetch records that have <code>owner_uid IN (values)</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchByOwnerUid(UInteger... values) {
-        return fetch(Dataset.DATASET.OWNER_UID, values);
-    }
-
-    /**
-     * Fetch records that have <code>name BETWEEN lowerInclusive AND upperInclusive</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchRangeOfName(String lowerInclusive, String upperInclusive) {
-        return fetchRange(Dataset.DATASET.NAME, lowerInclusive, upperInclusive);
-    }
-
-    /**
-     * Fetch records that have <code>name IN (values)</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchByName(String... values) {
-        return fetch(Dataset.DATASET.NAME, values);
-    }
-
-    /**
-     * Fetch records that have <code>is_public BETWEEN lowerInclusive AND upperInclusive</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchRangeOfIsPublic(Byte lowerInclusive, Byte upperInclusive) {
-        return fetchRange(Dataset.DATASET.IS_PUBLIC, lowerInclusive, upperInclusive);
-    }
-
-    /**
-     * Fetch records that have <code>is_public IN (values)</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchByIsPublic(Byte... values) {
-        return fetch(Dataset.DATASET.IS_PUBLIC, values);
-    }
-
-    /**
-     * Fetch records that have <code>description BETWEEN lowerInclusive AND upperInclusive</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchRangeOfDescription(String lowerInclusive, String upperInclusive) {
-        return fetchRange(Dataset.DATASET.DESCRIPTION, lowerInclusive, upperInclusive);
-    }
-
-    /**
-     * Fetch records that have <code>description IN (values)</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchByDescription(String... values) {
-        return fetch(Dataset.DATASET.DESCRIPTION, values);
-    }
-
-    /**
-     * Fetch records that have <code>creation_time BETWEEN lowerInclusive AND upperInclusive</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchRangeOfCreationTime(Timestamp lowerInclusive, Timestamp upperInclusive) {
-        return fetchRange(Dataset.DATASET.CREATION_TIME, lowerInclusive, upperInclusive);
-    }
-
-    /**
-     * Fetch records that have <code>creation_time IN (values)</code>
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Dataset> fetchByCreationTime(Timestamp... values) {
-        return fetch(Dataset.DATASET.CREATION_TIME, values);
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetUserAccessDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetUserAccessDao.java
deleted file mode 100644
index c7a98a450a7..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetUserAccessDao.java
+++ /dev/null
@@ -1,84 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class DatasetUserAccessDao extends DAOImpl> { - - /** - * Create a new DatasetUserAccessDao without any configuration - */ - public DatasetUserAccessDao() { - super(DatasetUserAccess.DATASET_USER_ACCESS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.DatasetUserAccess.class); - } - - /** - * Create a new DatasetUserAccessDao with an attached configuration - */ - public DatasetUserAccessDao(Configuration configuration) { - super(DatasetUserAccess.DATASET_USER_ACCESS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.DatasetUserAccess.class, configuration); - } - - @Override - public Record2 getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.DatasetUserAccess object) { - return compositeKeyRecord(object.getDid(), object.getUid()); - } - - /** - * Fetch records that have did BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfDid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(DatasetUserAccess.DATASET_USER_ACCESS.DID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have did IN (values) - */ - public List fetchByDid(UInteger... values) { - return fetch(DatasetUserAccess.DATASET_USER_ACCESS.DID, values); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(DatasetUserAccess.DATASET_USER_ACCESS.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN (values) - */ - public List fetchByUid(UInteger... 
values) { - return fetch(DatasetUserAccess.DATASET_USER_ACCESS.UID, values); - } - - /** - * Fetch records that have privilege BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfPrivilege(DatasetUserAccessPrivilege lowerInclusive, DatasetUserAccessPrivilege upperInclusive) { - return fetchRange(DatasetUserAccess.DATASET_USER_ACCESS.PRIVILEGE, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have privilege IN (values) - */ - public List fetchByPrivilege(DatasetUserAccessPrivilege... values) { - return fetch(DatasetUserAccess.DATASET_USER_ACCESS.PRIVILEGE, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetVersionDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetVersionDao.java deleted file mode 100644 index 4c80be93e50..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/DatasetVersionDao.java +++ /dev/null @@ -1,132 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.DatasetVersion; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.DatasetVersionRecord; -import org.jooq.Configuration; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class DatasetVersionDao extends DAOImpl { - - /** - * Create a new DatasetVersionDao without any configuration - */ - public DatasetVersionDao() { - super(DatasetVersion.DATASET_VERSION, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.DatasetVersion.class); - } - - /** - * Create a new DatasetVersionDao with an attached configuration - */ - public DatasetVersionDao(Configuration configuration) { - super(DatasetVersion.DATASET_VERSION, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.DatasetVersion.class, configuration); - } - - @Override - public UInteger getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.DatasetVersion object) { - return object.getDvid(); - } - - /** - * Fetch records that have dvid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfDvid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(DatasetVersion.DATASET_VERSION.DVID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have dvid IN (values) - */ - public List fetchByDvid(UInteger... values) { - return fetch(DatasetVersion.DATASET_VERSION.DVID, values); - } - - /** - * Fetch a unique record that has dvid = value - */ - public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.DatasetVersion fetchOneByDvid(UInteger value) { - return fetchOne(DatasetVersion.DATASET_VERSION.DVID, value); - } - - /** - * Fetch records that have did BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfDid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(DatasetVersion.DATASET_VERSION.DID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have did IN (values) - */ - public List fetchByDid(UInteger... 
values) { - return fetch(DatasetVersion.DATASET_VERSION.DID, values); - } - - /** - * Fetch records that have creator_uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfCreatorUid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(DatasetVersion.DATASET_VERSION.CREATOR_UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have creator_uid IN (values) - */ - public List fetchByCreatorUid(UInteger... values) { - return fetch(DatasetVersion.DATASET_VERSION.CREATOR_UID, values); - } - - /** - * Fetch records that have name BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfName(String lowerInclusive, String upperInclusive) { - return fetchRange(DatasetVersion.DATASET_VERSION.NAME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have name IN (values) - */ - public List fetchByName(String... values) { - return fetch(DatasetVersion.DATASET_VERSION.NAME, values); - } - - /** - * Fetch records that have version_hash BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfVersionHash(String lowerInclusive, String upperInclusive) { - return fetchRange(DatasetVersion.DATASET_VERSION.VERSION_HASH, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have version_hash IN (values) - */ - public List fetchByVersionHash(String... values) { - return fetch(DatasetVersion.DATASET_VERSION.VERSION_HASH, values); - } - - /** - * Fetch records that have creation_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfCreationTime(Timestamp lowerInclusive, Timestamp upperInclusive) { - return fetchRange(DatasetVersion.DATASET_VERSION.CREATION_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have creation_time IN (values) - */ - public List fetchByCreationTime(Timestamp... 
values) { - return fetch(DatasetVersion.DATASET_VERSION.CREATION_TIME, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/ProjectDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/ProjectDao.java deleted file mode 100644 index e4b886d0c18..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/ProjectDao.java +++ /dev/null @@ -1,132 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.Project; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.ProjectRecord; -import org.jooq.Configuration; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class ProjectDao extends DAOImpl { - - /** - * Create a new ProjectDao without any configuration - */ - public ProjectDao() { - super(Project.PROJECT, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Project.class); - } - - /** - * Create a new ProjectDao with an attached configuration - */ - public ProjectDao(Configuration configuration) { - super(Project.PROJECT, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Project.class, configuration); - } - - @Override - public UInteger getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Project object) { - return object.getPid(); - } - - /** - * Fetch records that have pid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfPid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(Project.PROJECT.PID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have pid IN (values) - */ - public List fetchByPid(UInteger... 
values) { - return fetch(Project.PROJECT.PID, values); - } - - /** - * Fetch a unique record that has pid = value - */ - public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Project fetchOneByPid(UInteger value) { - return fetchOne(Project.PROJECT.PID, value); - } - - /** - * Fetch records that have name BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfName(String lowerInclusive, String upperInclusive) { - return fetchRange(Project.PROJECT.NAME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have name IN (values) - */ - public List fetchByName(String... values) { - return fetch(Project.PROJECT.NAME, values); - } - - /** - * Fetch records that have description BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfDescription(String lowerInclusive, String upperInclusive) { - return fetchRange(Project.PROJECT.DESCRIPTION, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have description IN (values) - */ - public List fetchByDescription(String... values) { - return fetch(Project.PROJECT.DESCRIPTION, values); - } - - /** - * Fetch records that have owner_id BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfOwnerId(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(Project.PROJECT.OWNER_ID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have owner_id IN (values) - */ - public List fetchByOwnerId(UInteger... values) { - return fetch(Project.PROJECT.OWNER_ID, values); - } - - /** - * Fetch records that have creation_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfCreationTime(Timestamp lowerInclusive, Timestamp upperInclusive) { - return fetchRange(Project.PROJECT.CREATION_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have creation_time IN (values) - */ - public List fetchByCreationTime(Timestamp... 
values) { - return fetch(Project.PROJECT.CREATION_TIME, values); - } - - /** - * Fetch records that have color BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfColor(String lowerInclusive, String upperInclusive) { - return fetchRange(Project.PROJECT.COLOR, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have color IN (values) - */ - public List fetchByColor(String... values) { - return fetch(Project.PROJECT.COLOR, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/ProjectUserAccessDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/ProjectUserAccessDao.java deleted file mode 100644 index 303df37a86f..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/ProjectUserAccessDao.java +++ /dev/null @@ -1,84 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.ProjectUserAccessPrivilege; -import edu.uci.ics.texera.web.model.jooq.generated.tables.ProjectUserAccess; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.ProjectUserAccessRecord; -import org.jooq.Configuration; -import org.jooq.Record2; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class ProjectUserAccessDao extends DAOImpl> { - - /** - * Create a new ProjectUserAccessDao without any configuration - */ - public ProjectUserAccessDao() { - super(ProjectUserAccess.PROJECT_USER_ACCESS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.ProjectUserAccess.class); - } - - /** - * Create a new ProjectUserAccessDao with an attached configuration - */ - public ProjectUserAccessDao(Configuration configuration) { - super(ProjectUserAccess.PROJECT_USER_ACCESS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.ProjectUserAccess.class, configuration); - } - - @Override - public Record2 getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.ProjectUserAccess object) { - return compositeKeyRecord(object.getUid(), object.getPid()); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(ProjectUserAccess.PROJECT_USER_ACCESS.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN (values) - */ - public List fetchByUid(UInteger... values) { - return fetch(ProjectUserAccess.PROJECT_USER_ACCESS.UID, values); - } - - /** - * Fetch records that have pid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfPid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(ProjectUserAccess.PROJECT_USER_ACCESS.PID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have pid IN (values) - */ - public List fetchByPid(UInteger... 
values) { - return fetch(ProjectUserAccess.PROJECT_USER_ACCESS.PID, values); - } - - /** - * Fetch records that have privilege BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfPrivilege(ProjectUserAccessPrivilege lowerInclusive, ProjectUserAccessPrivilege upperInclusive) { - return fetchRange(ProjectUserAccess.PROJECT_USER_ACCESS.PRIVILEGE, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have privilege IN (values) - */ - public List fetchByPrivilege(ProjectUserAccessPrivilege... values) { - return fetch(ProjectUserAccess.PROJECT_USER_ACCESS.PRIVILEGE, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/PublicProjectDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/PublicProjectDao.java deleted file mode 100644 index 9379f81aec8..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/PublicProjectDao.java +++ /dev/null @@ -1,75 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.PublicProject; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.PublicProjectRecord; -import org.jooq.Configuration; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class PublicProjectDao extends DAOImpl { - - /** - * Create a new PublicProjectDao without any configuration - */ - public PublicProjectDao() { - super(PublicProject.PUBLIC_PROJECT, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.PublicProject.class); - } - - /** - * Create a new PublicProjectDao with an attached configuration - */ - public PublicProjectDao(Configuration configuration) { - super(PublicProject.PUBLIC_PROJECT, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.PublicProject.class, configuration); - } - - @Override - public UInteger getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.PublicProject object) { - return object.getPid(); - } - - /** - * Fetch records that have pid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfPid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(PublicProject.PUBLIC_PROJECT.PID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have pid IN (values) - */ - public List fetchByPid(UInteger... values) { - return fetch(PublicProject.PUBLIC_PROJECT.PID, values); - } - - /** - * Fetch a unique record that has pid = value - */ - public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.PublicProject fetchOneByPid(UInteger value) { - return fetchOne(PublicProject.PUBLIC_PROJECT.PID, value); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(PublicProject.PUBLIC_PROJECT.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN (values) - */ - public List fetchByUid(UInteger... 
values) { - return fetch(PublicProject.PUBLIC_PROJECT.UID, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/UserConfigDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/UserConfigDao.java deleted file mode 100644 index f11f8c58792..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/UserConfigDao.java +++ /dev/null @@ -1,83 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.UserConfig; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.UserConfigRecord; -import org.jooq.Configuration; -import org.jooq.Record2; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class UserConfigDao extends DAOImpl> { - - /** - * Create a new UserConfigDao without any configuration - */ - public UserConfigDao() { - super(UserConfig.USER_CONFIG, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.UserConfig.class); - } - - /** - * Create a new UserConfigDao with an attached configuration - */ - public UserConfigDao(Configuration configuration) { - super(UserConfig.USER_CONFIG, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.UserConfig.class, configuration); - } - - @Override - public Record2 getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.UserConfig object) { - return compositeKeyRecord(object.getUid(), object.getKey()); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(UserConfig.USER_CONFIG.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN 
(values) - */ - public List fetchByUid(UInteger... values) { - return fetch(UserConfig.USER_CONFIG.UID, values); - } - - /** - * Fetch records that have key BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfKey(String lowerInclusive, String upperInclusive) { - return fetchRange(UserConfig.USER_CONFIG.KEY, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have key IN (values) - */ - public List fetchByKey(String... values) { - return fetch(UserConfig.USER_CONFIG.KEY, values); - } - - /** - * Fetch records that have value BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfValue(String lowerInclusive, String upperInclusive) { - return fetchRange(UserConfig.USER_CONFIG.VALUE, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have value IN (values) - */ - public List fetchByValue(String... values) { - return fetch(UserConfig.USER_CONFIG.VALUE, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/UserDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/UserDao.java deleted file mode 100644 index 450d728d7bd..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/UserDao.java +++ /dev/null @@ -1,160 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole; -import edu.uci.ics.texera.web.model.jooq.generated.tables.User; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.UserRecord; -import org.jooq.Configuration; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class UserDao extends DAOImpl { - - /** - * Create a new UserDao without any configuration - */ - public UserDao() { - super(User.USER, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User.class); - } - - /** - * Create a new UserDao with an attached configuration - */ - public UserDao(Configuration configuration) { - super(User.USER, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User.class, configuration); - } - - @Override - public UInteger getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User object) { - return object.getUid(); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(User.USER.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN (values) - */ - public List fetchByUid(UInteger... values) { - return fetch(User.USER.UID, values); - } - - /** - * Fetch a unique record that has uid = value - */ - public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User fetchOneByUid(UInteger value) { - return fetchOne(User.USER.UID, value); - } - - /** - * Fetch records that have name BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfName(String lowerInclusive, String upperInclusive) { - return fetchRange(User.USER.NAME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have name IN (values) - */ - public List fetchByName(String... values) { - return fetch(User.USER.NAME, values); - } - - /** - * Fetch records that have email BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfEmail(String lowerInclusive, String upperInclusive) { - return fetchRange(User.USER.EMAIL, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have email IN (values) - */ - public List fetchByEmail(String... 
values) { - return fetch(User.USER.EMAIL, values); - } - - /** - * Fetch a unique record that has email = value - */ - public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User fetchOneByEmail(String value) { - return fetchOne(User.USER.EMAIL, value); - } - - /** - * Fetch records that have password BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfPassword(String lowerInclusive, String upperInclusive) { - return fetchRange(User.USER.PASSWORD, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have password IN (values) - */ - public List fetchByPassword(String... values) { - return fetch(User.USER.PASSWORD, values); - } - - /** - * Fetch records that have google_id BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfGoogleId(String lowerInclusive, String upperInclusive) { - return fetchRange(User.USER.GOOGLE_ID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have google_id IN (values) - */ - public List fetchByGoogleId(String... values) { - return fetch(User.USER.GOOGLE_ID, values); - } - - /** - * Fetch a unique record that has google_id = value - */ - public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User fetchOneByGoogleId(String value) { - return fetchOne(User.USER.GOOGLE_ID, value); - } - - /** - * Fetch records that have role BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfRole(UserRole lowerInclusive, UserRole upperInclusive) { - return fetchRange(User.USER.ROLE, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have role IN (values) - */ - public List fetchByRole(UserRole... 
values) { - return fetch(User.USER.ROLE, values); - } - - /** - * Fetch records that have google_avatar BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfGoogleAvatar(String lowerInclusive, String upperInclusive) { - return fetchRange(User.USER.GOOGLE_AVATAR, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have google_avatar IN (values) - */ - public List fetchByGoogleAvatar(String... values) { - return fetch(User.USER.GOOGLE_AVATAR, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowDao.java deleted file mode 100644 index e79c7a834f6..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowDao.java +++ /dev/null @@ -1,146 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.Workflow; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowRecord; -import org.jooq.Configuration; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowDao extends DAOImpl { - - /** - * Create a new WorkflowDao without any configuration - */ - public WorkflowDao() { - super(Workflow.WORKFLOW, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Workflow.class); - } - - /** - * Create a new WorkflowDao with an attached configuration - */ - public WorkflowDao(Configuration configuration) { - super(Workflow.WORKFLOW, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Workflow.class, configuration); - } - - @Override - public UInteger getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Workflow object) { - return object.getWid(); - } - - /** - * Fetch records that have name BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfName(String lowerInclusive, String upperInclusive) { - return fetchRange(Workflow.WORKFLOW.NAME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have name IN (values) - */ - public List fetchByName(String... values) { - return fetch(Workflow.WORKFLOW.NAME, values); - } - - /** - * Fetch records that have description BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfDescription(String lowerInclusive, String upperInclusive) { - return fetchRange(Workflow.WORKFLOW.DESCRIPTION, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have description IN (values) - */ - public List fetchByDescription(String... values) { - return fetch(Workflow.WORKFLOW.DESCRIPTION, values); - } - - /** - * Fetch records that have wid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfWid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(Workflow.WORKFLOW.WID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have wid IN (values) - */ - public List fetchByWid(UInteger... 
values) { - return fetch(Workflow.WORKFLOW.WID, values); - } - - /** - * Fetch a unique record that has wid = value - */ - public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Workflow fetchOneByWid(UInteger value) { - return fetchOne(Workflow.WORKFLOW.WID, value); - } - - /** - * Fetch records that have content BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfContent(String lowerInclusive, String upperInclusive) { - return fetchRange(Workflow.WORKFLOW.CONTENT, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have content IN (values) - */ - public List fetchByContent(String... values) { - return fetch(Workflow.WORKFLOW.CONTENT, values); - } - - /** - * Fetch records that have creation_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfCreationTime(Timestamp lowerInclusive, Timestamp upperInclusive) { - return fetchRange(Workflow.WORKFLOW.CREATION_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have creation_time IN (values) - */ - public List fetchByCreationTime(Timestamp... values) { - return fetch(Workflow.WORKFLOW.CREATION_TIME, values); - } - - /** - * Fetch records that have last_modified_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfLastModifiedTime(Timestamp lowerInclusive, Timestamp upperInclusive) { - return fetchRange(Workflow.WORKFLOW.LAST_MODIFIED_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have last_modified_time IN (values) - */ - public List fetchByLastModifiedTime(Timestamp... 
values) { - return fetch(Workflow.WORKFLOW.LAST_MODIFIED_TIME, values); - } - - /** - * Fetch records that have is_published BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfIsPublished(Byte lowerInclusive, Byte upperInclusive) { - return fetchRange(Workflow.WORKFLOW.IS_PUBLISHED, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have is_published IN (values) - */ - public List fetchByIsPublished(Byte... values) { - return fetch(Workflow.WORKFLOW.IS_PUBLISHED, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowExecutionsDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowExecutionsDao.java deleted file mode 100644 index 195762f49d7..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowExecutionsDao.java +++ /dev/null @@ -1,202 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowExecutions; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowExecutionsRecord; -import org.jooq.Configuration; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowExecutionsDao extends DAOImpl { - - /** - * Create a new WorkflowExecutionsDao without any configuration - */ - public WorkflowExecutionsDao() { - super(WorkflowExecutions.WORKFLOW_EXECUTIONS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowExecutions.class); - } - - /** - * Create a new WorkflowExecutionsDao with an attached configuration - */ - public WorkflowExecutionsDao(Configuration configuration) { - super(WorkflowExecutions.WORKFLOW_EXECUTIONS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowExecutions.class, configuration); - } - - @Override - public UInteger getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowExecutions object) { - return object.getEid(); - } - - /** - * Fetch records that have eid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfEid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.EID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have eid IN (values) - */ - public List fetchByEid(UInteger... values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.EID, values); - } - - /** - * Fetch a unique record that has eid = value - */ - public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowExecutions fetchOneByEid(UInteger value) { - return fetchOne(WorkflowExecutions.WORKFLOW_EXECUTIONS.EID, value); - } - - /** - * Fetch records that have vid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfVid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.VID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have vid IN (values) - */ - public List fetchByVid(UInteger... 
values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.VID, values); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN (values) - */ - public List fetchByUid(UInteger... values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.UID, values); - } - - /** - * Fetch records that have status BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfStatus(Byte lowerInclusive, Byte upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.STATUS, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have status IN (values) - */ - public List fetchByStatus(Byte... values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.STATUS, values); - } - - /** - * Fetch records that have result BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfResult(String lowerInclusive, String upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.RESULT, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have result IN (values) - */ - public List fetchByResult(String... values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.RESULT, values); - } - - /** - * Fetch records that have starting_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfStartingTime(Timestamp lowerInclusive, Timestamp upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.STARTING_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have starting_time IN (values) - */ - public List fetchByStartingTime(Timestamp... 
values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.STARTING_TIME, values); - } - - /** - * Fetch records that have last_update_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfLastUpdateTime(Timestamp lowerInclusive, Timestamp upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.LAST_UPDATE_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have last_update_time IN (values) - */ - public List fetchByLastUpdateTime(Timestamp... values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.LAST_UPDATE_TIME, values); - } - - /** - * Fetch records that have bookmarked BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfBookmarked(Byte lowerInclusive, Byte upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.BOOKMARKED, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have bookmarked IN (values) - */ - public List fetchByBookmarked(Byte... values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.BOOKMARKED, values); - } - - /** - * Fetch records that have name BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfName(String lowerInclusive, String upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.NAME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have name IN (values) - */ - public List fetchByName(String... values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.NAME, values); - } - - /** - * Fetch records that have environment_version BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfEnvironmentVersion(String lowerInclusive, String upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.ENVIRONMENT_VERSION, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have environment_version IN (values) - */ - public List fetchByEnvironmentVersion(String... 
values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.ENVIRONMENT_VERSION, values); - } - - /** - * Fetch records that have log_location BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfLogLocation(String lowerInclusive, String upperInclusive) { - return fetchRange(WorkflowExecutions.WORKFLOW_EXECUTIONS.LOG_LOCATION, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have log_location IN (values) - */ - public List fetchByLogLocation(String... values) { - return fetch(WorkflowExecutions.WORKFLOW_EXECUTIONS.LOG_LOCATION, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowOfProjectDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowOfProjectDao.java deleted file mode 100644 index 19952ed3d68..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowOfProjectDao.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowOfProject; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowOfProjectRecord; -import org.jooq.Configuration; -import org.jooq.Record2; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowOfProjectDao extends DAOImpl> { - - /** - * Create a new WorkflowOfProjectDao without any configuration - */ - public WorkflowOfProjectDao() { - super(WorkflowOfProject.WORKFLOW_OF_PROJECT, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowOfProject.class); - } - - /** - * Create a new WorkflowOfProjectDao with an attached configuration - */ - public WorkflowOfProjectDao(Configuration configuration) { - super(WorkflowOfProject.WORKFLOW_OF_PROJECT, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowOfProject.class, configuration); - } - - @Override - public Record2 getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowOfProject object) { - return compositeKeyRecord(object.getWid(), object.getPid()); - } - - /** - * Fetch records that have wid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfWid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowOfProject.WORKFLOW_OF_PROJECT.WID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have wid IN (values) - */ - public List fetchByWid(UInteger... values) { - return fetch(WorkflowOfProject.WORKFLOW_OF_PROJECT.WID, values); - } - - /** - * Fetch records that have pid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfPid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowOfProject.WORKFLOW_OF_PROJECT.PID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have pid IN (values) - */ - public List fetchByPid(UInteger... 
values) { - return fetch(WorkflowOfProject.WORKFLOW_OF_PROJECT.PID, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowOfUserDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowOfUserDao.java deleted file mode 100644 index a63d642d7b7..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowOfUserDao.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowOfUser; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowOfUserRecord; -import org.jooq.Configuration; -import org.jooq.Record2; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowOfUserDao extends DAOImpl> { - - /** - * Create a new WorkflowOfUserDao without any configuration - */ - public WorkflowOfUserDao() { - super(WorkflowOfUser.WORKFLOW_OF_USER, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowOfUser.class); - } - - /** - * Create a new WorkflowOfUserDao with an attached configuration - */ - public WorkflowOfUserDao(Configuration configuration) { - super(WorkflowOfUser.WORKFLOW_OF_USER, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowOfUser.class, configuration); - } - - @Override - public Record2 getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowOfUser object) { - return compositeKeyRecord(object.getUid(), object.getWid()); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger upperInclusive) { - return 
fetchRange(WorkflowOfUser.WORKFLOW_OF_USER.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN (values) - */ - public List fetchByUid(UInteger... values) { - return fetch(WorkflowOfUser.WORKFLOW_OF_USER.UID, values); - } - - /** - * Fetch records that have wid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfWid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowOfUser.WORKFLOW_OF_USER.WID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have wid IN (values) - */ - public List fetchByWid(UInteger... values) { - return fetch(WorkflowOfUser.WORKFLOW_OF_USER.WID, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowRuntimeStatisticsDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowRuntimeStatisticsDao.java deleted file mode 100644 index 4aad3529311..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowRuntimeStatisticsDao.java +++ /dev/null @@ -1,197 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowRuntimeStatistics; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowRuntimeStatisticsRecord; -import org.jooq.Configuration; -import org.jooq.Record4; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; -import org.jooq.types.ULong; - -import java.sql.Timestamp; -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowRuntimeStatisticsDao extends DAOImpl> { - - /** - * Create a new WorkflowRuntimeStatisticsDao without any configuration - */ - public WorkflowRuntimeStatisticsDao() { - super(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowRuntimeStatistics.class); - } - - /** - * Create a new WorkflowRuntimeStatisticsDao with an attached configuration - */ - public WorkflowRuntimeStatisticsDao(Configuration configuration) { - super(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowRuntimeStatistics.class, configuration); - } - - @Override - public Record4 getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowRuntimeStatistics object) { - return compositeKeyRecord(object.getWorkflowId(), object.getExecutionId(), object.getOperatorId(), object.getTime()); - } - - /** - * Fetch records that have workflow_id BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfWorkflowId(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.WORKFLOW_ID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have workflow_id IN (values) - */ - public List fetchByWorkflowId(UInteger... values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.WORKFLOW_ID, values); - } - - /** - * Fetch records that have execution_id BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfExecutionId(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.EXECUTION_ID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have execution_id IN (values) - */ - public List fetchByExecutionId(UInteger... 
values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.EXECUTION_ID, values); - } - - /** - * Fetch records that have operator_id BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfOperatorId(String lowerInclusive, String upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.OPERATOR_ID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have operator_id IN (values) - */ - public List fetchByOperatorId(String... values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.OPERATOR_ID, values); - } - - /** - * Fetch records that have time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfTime(Timestamp lowerInclusive, Timestamp upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have time IN (values) - */ - public List fetchByTime(Timestamp... values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.TIME, values); - } - - /** - * Fetch records that have input_tuple_cnt BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfInputTupleCnt(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.INPUT_TUPLE_CNT, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have input_tuple_cnt IN (values) - */ - public List fetchByInputTupleCnt(UInteger... 
values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.INPUT_TUPLE_CNT, values); - } - - /** - * Fetch records that have output_tuple_cnt BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfOutputTupleCnt(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.OUTPUT_TUPLE_CNT, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have output_tuple_cnt IN (values) - */ - public List fetchByOutputTupleCnt(UInteger... values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.OUTPUT_TUPLE_CNT, values); - } - - /** - * Fetch records that have status BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfStatus(Byte lowerInclusive, Byte upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.STATUS, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have status IN (values) - */ - public List fetchByStatus(Byte... values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.STATUS, values); - } - - /** - * Fetch records that have data_processing_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfDataProcessingTime(ULong lowerInclusive, ULong upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.DATA_PROCESSING_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have data_processing_time IN (values) - */ - public List fetchByDataProcessingTime(ULong... 
values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.DATA_PROCESSING_TIME, values); - } - - /** - * Fetch records that have control_processing_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfControlProcessingTime(ULong lowerInclusive, ULong upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.CONTROL_PROCESSING_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have control_processing_time IN (values) - */ - public List fetchByControlProcessingTime(ULong... values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.CONTROL_PROCESSING_TIME, values); - } - - /** - * Fetch records that have idle_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfIdleTime(ULong lowerInclusive, ULong upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.IDLE_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have idle_time IN (values) - */ - public List fetchByIdleTime(ULong... values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.IDLE_TIME, values); - } - - /** - * Fetch records that have num_workers BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfNumWorkers(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.NUM_WORKERS, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have num_workers IN (values) - */ - public List fetchByNumWorkers(UInteger... 
values) { - return fetch(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.NUM_WORKERS, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserAccessDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserAccessDao.java deleted file mode 100644 index 33691f02f5c..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserAccessDao.java +++ /dev/null @@ -1,84 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.WorkflowUserAccessPrivilege; -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserAccess; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowUserAccessRecord; -import org.jooq.Configuration; -import org.jooq.Record2; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserAccessDao extends DAOImpl> { - - /** - * Create a new WorkflowUserAccessDao without any configuration - */ - public WorkflowUserAccessDao() { - super(WorkflowUserAccess.WORKFLOW_USER_ACCESS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserAccess.class); - } - - /** - * Create a new WorkflowUserAccessDao with an attached configuration - */ - public WorkflowUserAccessDao(Configuration configuration) { - super(WorkflowUserAccess.WORKFLOW_USER_ACCESS, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserAccess.class, configuration); - } - - @Override - public Record2 getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserAccess object) { - return compositeKeyRecord(object.getUid(), object.getWid()); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowUserAccess.WORKFLOW_USER_ACCESS.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN (values) - */ - public List fetchByUid(UInteger... values) { - return fetch(WorkflowUserAccess.WORKFLOW_USER_ACCESS.UID, values); - } - - /** - * Fetch records that have wid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfWid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowUserAccess.WORKFLOW_USER_ACCESS.WID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have wid IN (values) - */ - public List fetchByWid(UInteger... 
values) { - return fetch(WorkflowUserAccess.WORKFLOW_USER_ACCESS.WID, values); - } - - /** - * Fetch records that have privilege BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfPrivilege(WorkflowUserAccessPrivilege lowerInclusive, WorkflowUserAccessPrivilege upperInclusive) { - return fetchRange(WorkflowUserAccess.WORKFLOW_USER_ACCESS.PRIVILEGE, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have privilege IN (values) - */ - public List fetchByPrivilege(WorkflowUserAccessPrivilege... values) { - return fetch(WorkflowUserAccess.WORKFLOW_USER_ACCESS.PRIVILEGE, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserClonesDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserClonesDao.java deleted file mode 100644 index 8336a6132f8..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserClonesDao.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserClones; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowUserClonesRecord; -import org.jooq.Configuration; -import org.jooq.Record2; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserClonesDao extends DAOImpl> { - - /** - * Create a new WorkflowUserClonesDao without any configuration - */ - public WorkflowUserClonesDao() { - super(WorkflowUserClones.WORKFLOW_USER_CLONES, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserClones.class); - } - - /** - * Create a new WorkflowUserClonesDao with an attached configuration - */ - public WorkflowUserClonesDao(Configuration configuration) { - super(WorkflowUserClones.WORKFLOW_USER_CLONES, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserClones.class, configuration); - } - - @Override - public Record2 getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserClones object) { - return compositeKeyRecord(object.getUid(), object.getWid()); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowUserClones.WORKFLOW_USER_CLONES.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN (values) - */ - public List fetchByUid(UInteger... values) { - return fetch(WorkflowUserClones.WORKFLOW_USER_CLONES.UID, values); - } - - /** - * Fetch records that have wid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfWid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowUserClones.WORKFLOW_USER_CLONES.WID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have wid IN (values) - */ - public List fetchByWid(UInteger... 
values) { - return fetch(WorkflowUserClones.WORKFLOW_USER_CLONES.WID, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserLikesDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserLikesDao.java deleted file mode 100644 index 2850bcb90c9..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowUserLikesDao.java +++ /dev/null @@ -1,69 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserLikes; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowUserLikesRecord; -import org.jooq.Configuration; -import org.jooq.Record2; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserLikesDao extends DAOImpl> { - - /** - * Create a new WorkflowUserLikesDao without any configuration - */ - public WorkflowUserLikesDao() { - super(WorkflowUserLikes.WORKFLOW_USER_LIKES, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserLikes.class); - } - - /** - * Create a new WorkflowUserLikesDao with an attached configuration - */ - public WorkflowUserLikesDao(Configuration configuration) { - super(WorkflowUserLikes.WORKFLOW_USER_LIKES, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserLikes.class, configuration); - } - - @Override - public Record2 getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserLikes object) { - return compositeKeyRecord(object.getUid(), object.getWid()); - } - - /** - * Fetch records that have uid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfUid(UInteger lowerInclusive, UInteger 
upperInclusive) { - return fetchRange(WorkflowUserLikes.WORKFLOW_USER_LIKES.UID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have uid IN (values) - */ - public List fetchByUid(UInteger... values) { - return fetch(WorkflowUserLikes.WORKFLOW_USER_LIKES.UID, values); - } - - /** - * Fetch records that have wid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfWid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowUserLikes.WORKFLOW_USER_LIKES.WID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have wid IN (values) - */ - public List fetchByWid(UInteger... values) { - return fetch(WorkflowUserLikes.WORKFLOW_USER_LIKES.WID, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowVersionDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowVersionDao.java deleted file mode 100644 index b86ce7cde36..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowVersionDao.java +++ /dev/null @@ -1,104 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowVersion; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowVersionRecord; -import org.jooq.Configuration; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowVersionDao extends DAOImpl { - - /** - * Create a new WorkflowVersionDao without any configuration - */ - public WorkflowVersionDao() { - super(WorkflowVersion.WORKFLOW_VERSION, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowVersion.class); - } - - /** - * Create a new WorkflowVersionDao with an attached configuration - */ - public WorkflowVersionDao(Configuration configuration) { - super(WorkflowVersion.WORKFLOW_VERSION, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowVersion.class, configuration); - } - - @Override - public UInteger getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowVersion object) { - return object.getVid(); - } - - /** - * Fetch records that have vid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfVid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowVersion.WORKFLOW_VERSION.VID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have vid IN (values) - */ - public List fetchByVid(UInteger... values) { - return fetch(WorkflowVersion.WORKFLOW_VERSION.VID, values); - } - - /** - * Fetch a unique record that has vid = value - */ - public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowVersion fetchOneByVid(UInteger value) { - return fetchOne(WorkflowVersion.WORKFLOW_VERSION.VID, value); - } - - /** - * Fetch records that have wid BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfWid(UInteger lowerInclusive, UInteger upperInclusive) { - return fetchRange(WorkflowVersion.WORKFLOW_VERSION.WID, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have wid IN (values) - */ - public List fetchByWid(UInteger... 
values) { - return fetch(WorkflowVersion.WORKFLOW_VERSION.WID, values); - } - - /** - * Fetch records that have content BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfContent(String lowerInclusive, String upperInclusive) { - return fetchRange(WorkflowVersion.WORKFLOW_VERSION.CONTENT, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have content IN (values) - */ - public List fetchByContent(String... values) { - return fetch(WorkflowVersion.WORKFLOW_VERSION.CONTENT, values); - } - - /** - * Fetch records that have creation_time BETWEEN lowerInclusive AND upperInclusive - */ - public List fetchRangeOfCreationTime(Timestamp lowerInclusive, Timestamp upperInclusive) { - return fetchRange(WorkflowVersion.WORKFLOW_VERSION.CREATION_TIME, lowerInclusive, upperInclusive); - } - - /** - * Fetch records that have creation_time IN (values) - */ - public List fetchByCreationTime(Timestamp... values) { - return fetch(WorkflowVersion.WORKFLOW_VERSION.CREATION_TIME, values); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowViewCountDao.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowViewCountDao.java deleted file mode 100644 index 1f7bf7f06e4..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/daos/WorkflowViewCountDao.java +++ /dev/null @@ -1,75 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.daos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowViewCount; -import edu.uci.ics.texera.web.model.jooq.generated.tables.records.WorkflowViewCountRecord; -import org.jooq.Configuration; -import org.jooq.impl.DAOImpl; -import org.jooq.types.UInteger; - -import java.util.List; - - -/** - * This class is generated by jOOQ. 
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public class WorkflowViewCountDao extends DAOImpl<WorkflowViewCountRecord, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowViewCount, UInteger> {
-
-    /**
-     * Create a new WorkflowViewCountDao without any configuration
-     */
-    public WorkflowViewCountDao() {
-        super(WorkflowViewCount.WORKFLOW_VIEW_COUNT, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowViewCount.class);
-    }
-
-    /**
-     * Create a new WorkflowViewCountDao with an attached configuration
-     */
-    public WorkflowViewCountDao(Configuration configuration) {
-        super(WorkflowViewCount.WORKFLOW_VIEW_COUNT, edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowViewCount.class, configuration);
-    }
-
-    @Override
-    public UInteger getId(edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowViewCount object) {
-        return object.getWid();
-    }
-
-    /**
-     * Fetch records that have wid BETWEEN lowerInclusive AND upperInclusive
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowViewCount> fetchRangeOfWid(UInteger lowerInclusive, UInteger upperInclusive) {
-        return fetchRange(WorkflowViewCount.WORKFLOW_VIEW_COUNT.WID, lowerInclusive, upperInclusive);
-    }
-
-    /**
-     * Fetch records that have wid IN (values)
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowViewCount> fetchByWid(UInteger... values) {
-        return fetch(WorkflowViewCount.WORKFLOW_VIEW_COUNT.WID, values);
-    }
-
-    /**
-     * Fetch a unique record that has wid = value
-     */
-    public edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowViewCount fetchOneByWid(UInteger value) {
-        return fetchOne(WorkflowViewCount.WORKFLOW_VIEW_COUNT.WID, value);
-    }
-
-    /**
-     * Fetch records that have view_count BETWEEN lowerInclusive AND upperInclusive
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowViewCount> fetchRangeOfViewCount(UInteger lowerInclusive, UInteger upperInclusive) {
-        return fetchRange(WorkflowViewCount.WORKFLOW_VIEW_COUNT.VIEW_COUNT, lowerInclusive, upperInclusive);
-    }
-
-    /**
-     * Fetch records that have view_count IN (values)
-     */
-    public List<edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowViewCount> fetchByViewCount(UInteger... values) {
-        return fetch(WorkflowViewCount.WORKFLOW_VIEW_COUNT.VIEW_COUNT, values);
-    }
-}
diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDataset.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDataset.java
deleted file mode 100644
index 43a968213b3..00000000000
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDataset.java
+++ /dev/null
@@ -1,92 +0,0 @@
-/*
- * This file is generated by jOOQ.
- */
-package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces;
-
-
-import org.jooq.types.UInteger;
-
-import java.io.Serializable;
-import java.sql.Timestamp;
-
-
-/**
- * This class is generated by jOOQ.
- */
-@SuppressWarnings({"all", "unchecked", "rawtypes"})
-public interface IDataset extends Serializable {
-
-    /**
-     * Setter for texera_db.dataset.did.
-     */
-    public void setDid(UInteger value);
-
-    /**
-     * Getter for texera_db.dataset.did.
-     */
-    public UInteger getDid();
-
-    /**
-     * Setter for texera_db.dataset.owner_uid.
-     */
-    public void setOwnerUid(UInteger value);
-
-    /**
-     * Getter for texera_db.dataset.owner_uid.
-     */
-    public UInteger getOwnerUid();
-
-    /**
-     * Setter for texera_db.dataset.name.
-     */
-    public void setName(String value);
-
-    /**
-     * Getter for texera_db.dataset.name.
-     */
-    public String getName();
-
-    /**
-     * Setter for texera_db.dataset.is_public.
-     */
-    public void setIsPublic(Byte value);
-
-    /**
-     * Getter for texera_db.dataset.is_public.
-     */
-    public Byte getIsPublic();
-
-    /**
-     * Setter for texera_db.dataset.description.
-     */
-    public void setDescription(String value);
-
-    /**
-     * Getter for texera_db.dataset.description.
-     */
-    public String getDescription();
-
-    /**
-     * Setter for texera_db.dataset.creation_time.
-     */
-    public void setCreationTime(Timestamp value);
-
-    /**
-     * Getter for texera_db.dataset.creation_time.
- */ - public Timestamp getCreationTime(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IDataset - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IDataset from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IDataset - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDatasetUserAccess.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDatasetUserAccess.java deleted file mode 100644 index 7b6a5873811..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDatasetUserAccess.java +++ /dev/null @@ -1,62 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.DatasetUserAccessPrivilege; -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IDatasetUserAccess extends Serializable { - - /** - * Setter for texera_db.dataset_user_access.did. - */ - public void setDid(UInteger value); - - /** - * Getter for texera_db.dataset_user_access.did. - */ - public UInteger getDid(); - - /** - * Setter for texera_db.dataset_user_access.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.dataset_user_access.uid. - */ - public UInteger getUid(); - - /** - * Setter for texera_db.dataset_user_access.privilege. 
- */ - public void setPrivilege(DatasetUserAccessPrivilege value); - - /** - * Getter for texera_db.dataset_user_access.privilege. - */ - public DatasetUserAccessPrivilege getPrivilege(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IDatasetUserAccess - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IDatasetUserAccess from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IDatasetUserAccess - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDatasetVersion.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDatasetVersion.java deleted file mode 100644 index 8ae17ed9fc2..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IDatasetVersion.java +++ /dev/null @@ -1,92 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IDatasetVersion extends Serializable { - - /** - * Setter for texera_db.dataset_version.dvid. - */ - public void setDvid(UInteger value); - - /** - * Getter for texera_db.dataset_version.dvid. - */ - public UInteger getDvid(); - - /** - * Setter for texera_db.dataset_version.did. - */ - public void setDid(UInteger value); - - /** - * Getter for texera_db.dataset_version.did. - */ - public UInteger getDid(); - - /** - * Setter for texera_db.dataset_version.creator_uid. 
- */ - public void setCreatorUid(UInteger value); - - /** - * Getter for texera_db.dataset_version.creator_uid. - */ - public UInteger getCreatorUid(); - - /** - * Setter for texera_db.dataset_version.name. - */ - public void setName(String value); - - /** - * Getter for texera_db.dataset_version.name. - */ - public String getName(); - - /** - * Setter for texera_db.dataset_version.version_hash. - */ - public void setVersionHash(String value); - - /** - * Getter for texera_db.dataset_version.version_hash. - */ - public String getVersionHash(); - - /** - * Setter for texera_db.dataset_version.creation_time. - */ - public void setCreationTime(Timestamp value); - - /** - * Getter for texera_db.dataset_version.creation_time. - */ - public Timestamp getCreationTime(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IDatasetVersion - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IDatasetVersion from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IDatasetVersion - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IProject.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IProject.java deleted file mode 100644 index 9a7ef8df403..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IProject.java +++ /dev/null @@ -1,92 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IProject extends Serializable { - - /** - * Setter for texera_db.project.pid. - */ - public void setPid(UInteger value); - - /** - * Getter for texera_db.project.pid. - */ - public UInteger getPid(); - - /** - * Setter for texera_db.project.name. - */ - public void setName(String value); - - /** - * Getter for texera_db.project.name. - */ - public String getName(); - - /** - * Setter for texera_db.project.description. - */ - public void setDescription(String value); - - /** - * Getter for texera_db.project.description. - */ - public String getDescription(); - - /** - * Setter for texera_db.project.owner_id. - */ - public void setOwnerId(UInteger value); - - /** - * Getter for texera_db.project.owner_id. - */ - public UInteger getOwnerId(); - - /** - * Setter for texera_db.project.creation_time. - */ - public void setCreationTime(Timestamp value); - - /** - * Getter for texera_db.project.creation_time. - */ - public Timestamp getCreationTime(); - - /** - * Setter for texera_db.project.color. - */ - public void setColor(String value); - - /** - * Getter for texera_db.project.color. 
- */ - public String getColor(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IProject - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IProject from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IProject - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IProjectUserAccess.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IProjectUserAccess.java deleted file mode 100644 index 5816ab90e09..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IProjectUserAccess.java +++ /dev/null @@ -1,62 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.ProjectUserAccessPrivilege; -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IProjectUserAccess extends Serializable { - - /** - * Setter for texera_db.project_user_access.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.project_user_access.uid. - */ - public UInteger getUid(); - - /** - * Setter for texera_db.project_user_access.pid. - */ - public void setPid(UInteger value); - - /** - * Getter for texera_db.project_user_access.pid. - */ - public UInteger getPid(); - - /** - * Setter for texera_db.project_user_access.privilege. 
- */ - public void setPrivilege(ProjectUserAccessPrivilege value); - - /** - * Getter for texera_db.project_user_access.privilege. - */ - public ProjectUserAccessPrivilege getPrivilege(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IProjectUserAccess - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IProjectUserAccess from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IProjectUserAccess - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IPublicProject.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IPublicProject.java deleted file mode 100644 index 7bda5c24c99..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IPublicProject.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IPublicProject extends Serializable { - - /** - * Setter for texera_db.public_project.pid. - */ - public void setPid(UInteger value); - - /** - * Getter for texera_db.public_project.pid. - */ - public UInteger getPid(); - - /** - * Setter for texera_db.public_project.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.public_project.uid. 
- */ - public UInteger getUid(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IPublicProject - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IPublicProject from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IPublicProject - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IUser.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IUser.java deleted file mode 100644 index 6efafbd9108..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IUser.java +++ /dev/null @@ -1,102 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole; -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IUser extends Serializable { - - /** - * Setter for texera_db.user.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.user.uid. - */ - public UInteger getUid(); - - /** - * Setter for texera_db.user.name. - */ - public void setName(String value); - - /** - * Getter for texera_db.user.name. - */ - public String getName(); - - /** - * Setter for texera_db.user.email. - */ - public void setEmail(String value); - - /** - * Getter for texera_db.user.email. - */ - public String getEmail(); - - /** - * Setter for texera_db.user.password. 
- */ - public void setPassword(String value); - - /** - * Getter for texera_db.user.password. - */ - public String getPassword(); - - /** - * Setter for texera_db.user.google_id. - */ - public void setGoogleId(String value); - - /** - * Getter for texera_db.user.google_id. - */ - public String getGoogleId(); - - /** - * Setter for texera_db.user.role. - */ - public void setRole(UserRole value); - - /** - * Getter for texera_db.user.role. - */ - public UserRole getRole(); - - /** - * Setter for texera_db.user.google_avatar. - */ - public void setGoogleAvatar(String value); - - /** - * Getter for texera_db.user.google_avatar. - */ - public String getGoogleAvatar(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IUser - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IUser from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IUser - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IUserConfig.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IUserConfig.java deleted file mode 100644 index 2bd1780741d..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IUserConfig.java +++ /dev/null @@ -1,61 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IUserConfig extends Serializable { - - /** - * Setter for texera_db.user_config.uid. 
- */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.user_config.uid. - */ - public UInteger getUid(); - - /** - * Setter for texera_db.user_config.key. - */ - public void setKey(String value); - - /** - * Getter for texera_db.user_config.key. - */ - public String getKey(); - - /** - * Setter for texera_db.user_config.value. - */ - public void setValue(String value); - - /** - * Getter for texera_db.user_config.value. - */ - public String getValue(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IUserConfig - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IUserConfig from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IUserConfig - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflow.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflow.java deleted file mode 100644 index e79c8afa2e8..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflow.java +++ /dev/null @@ -1,102 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflow extends Serializable { - - /** - * Setter for texera_db.workflow.name. - */ - public void setName(String value); - - /** - * Getter for texera_db.workflow.name. 
- */ - public String getName(); - - /** - * Setter for texera_db.workflow.description. - */ - public void setDescription(String value); - - /** - * Getter for texera_db.workflow.description. - */ - public String getDescription(); - - /** - * Setter for texera_db.workflow.wid. - */ - public void setWid(UInteger value); - - /** - * Getter for texera_db.workflow.wid. - */ - public UInteger getWid(); - - /** - * Setter for texera_db.workflow.content. - */ - public void setContent(String value); - - /** - * Getter for texera_db.workflow.content. - */ - public String getContent(); - - /** - * Setter for texera_db.workflow.creation_time. - */ - public void setCreationTime(Timestamp value); - - /** - * Getter for texera_db.workflow.creation_time. - */ - public Timestamp getCreationTime(); - - /** - * Setter for texera_db.workflow.last_modified_time. - */ - public void setLastModifiedTime(Timestamp value); - - /** - * Getter for texera_db.workflow.last_modified_time. - */ - public Timestamp getLastModifiedTime(); - - /** - * Setter for texera_db.workflow.is_published. - */ - public void setIsPublished(Byte value); - - /** - * Getter for texera_db.workflow.is_published. 
- */ - public Byte getIsPublished(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflow - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflow from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflow - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowExecutions.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowExecutions.java deleted file mode 100644 index c69e3ad84c3..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowExecutions.java +++ /dev/null @@ -1,142 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowExecutions extends Serializable { - - /** - * Setter for texera_db.workflow_executions.eid. - */ - public void setEid(UInteger value); - - /** - * Getter for texera_db.workflow_executions.eid. - */ - public UInteger getEid(); - - /** - * Setter for texera_db.workflow_executions.vid. - */ - public void setVid(UInteger value); - - /** - * Getter for texera_db.workflow_executions.vid. - */ - public UInteger getVid(); - - /** - * Setter for texera_db.workflow_executions.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.workflow_executions.uid. 
- */ - public UInteger getUid(); - - /** - * Setter for texera_db.workflow_executions.status. - */ - public void setStatus(Byte value); - - /** - * Getter for texera_db.workflow_executions.status. - */ - public Byte getStatus(); - - /** - * Setter for texera_db.workflow_executions.result. - */ - public void setResult(String value); - - /** - * Getter for texera_db.workflow_executions.result. - */ - public String getResult(); - - /** - * Setter for texera_db.workflow_executions.starting_time. - */ - public void setStartingTime(Timestamp value); - - /** - * Getter for texera_db.workflow_executions.starting_time. - */ - public Timestamp getStartingTime(); - - /** - * Setter for texera_db.workflow_executions.last_update_time. - */ - public void setLastUpdateTime(Timestamp value); - - /** - * Getter for texera_db.workflow_executions.last_update_time. - */ - public Timestamp getLastUpdateTime(); - - /** - * Setter for texera_db.workflow_executions.bookmarked. - */ - public void setBookmarked(Byte value); - - /** - * Getter for texera_db.workflow_executions.bookmarked. - */ - public Byte getBookmarked(); - - /** - * Setter for texera_db.workflow_executions.name. - */ - public void setName(String value); - - /** - * Getter for texera_db.workflow_executions.name. - */ - public String getName(); - - /** - * Setter for texera_db.workflow_executions.environment_version. - */ - public void setEnvironmentVersion(String value); - - /** - * Getter for texera_db.workflow_executions.environment_version. - */ - public String getEnvironmentVersion(); - - /** - * Setter for texera_db.workflow_executions.log_location. - */ - public void setLogLocation(String value); - - /** - * Getter for texera_db.workflow_executions.log_location. 
- */ - public String getLogLocation(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowExecutions - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowExecutions from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowExecutions - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowOfProject.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowOfProject.java deleted file mode 100644 index e0f1f5fcb3d..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowOfProject.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowOfProject extends Serializable { - - /** - * Setter for texera_db.workflow_of_project.wid. - */ - public void setWid(UInteger value); - - /** - * Getter for texera_db.workflow_of_project.wid. - */ - public UInteger getWid(); - - /** - * Setter for texera_db.workflow_of_project.pid. - */ - public void setPid(UInteger value); - - /** - * Getter for texera_db.workflow_of_project.pid. 
- */ - public UInteger getPid(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowOfProject - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowOfProject from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowOfProject - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowOfUser.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowOfUser.java deleted file mode 100644 index 9620589aac3..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowOfUser.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowOfUser extends Serializable { - - /** - * Setter for texera_db.workflow_of_user.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.workflow_of_user.uid. - */ - public UInteger getUid(); - - /** - * Setter for texera_db.workflow_of_user.wid. - */ - public void setWid(UInteger value); - - /** - * Getter for texera_db.workflow_of_user.wid. 
- */ - public UInteger getWid(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowOfUser - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowOfUser from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowOfUser - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowRuntimeStatistics.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowRuntimeStatistics.java deleted file mode 100644 index a9ec75b6b0a..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowRuntimeStatistics.java +++ /dev/null @@ -1,143 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; -import org.jooq.types.ULong; - -import java.io.Serializable; -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowRuntimeStatistics extends Serializable { - - /** - * Setter for texera_db.workflow_runtime_statistics.workflow_id. - */ - public void setWorkflowId(UInteger value); - - /** - * Getter for texera_db.workflow_runtime_statistics.workflow_id. - */ - public UInteger getWorkflowId(); - - /** - * Setter for texera_db.workflow_runtime_statistics.execution_id. - */ - public void setExecutionId(UInteger value); - - /** - * Getter for texera_db.workflow_runtime_statistics.execution_id. 
- */ - public UInteger getExecutionId(); - - /** - * Setter for texera_db.workflow_runtime_statistics.operator_id. - */ - public void setOperatorId(String value); - - /** - * Getter for texera_db.workflow_runtime_statistics.operator_id. - */ - public String getOperatorId(); - - /** - * Setter for texera_db.workflow_runtime_statistics.time. - */ - public void setTime(Timestamp value); - - /** - * Getter for texera_db.workflow_runtime_statistics.time. - */ - public Timestamp getTime(); - - /** - * Setter for texera_db.workflow_runtime_statistics.input_tuple_cnt. - */ - public void setInputTupleCnt(UInteger value); - - /** - * Getter for texera_db.workflow_runtime_statistics.input_tuple_cnt. - */ - public UInteger getInputTupleCnt(); - - /** - * Setter for texera_db.workflow_runtime_statistics.output_tuple_cnt. - */ - public void setOutputTupleCnt(UInteger value); - - /** - * Getter for texera_db.workflow_runtime_statistics.output_tuple_cnt. - */ - public UInteger getOutputTupleCnt(); - - /** - * Setter for texera_db.workflow_runtime_statistics.status. - */ - public void setStatus(Byte value); - - /** - * Getter for texera_db.workflow_runtime_statistics.status. - */ - public Byte getStatus(); - - /** - * Setter for texera_db.workflow_runtime_statistics.data_processing_time. - */ - public void setDataProcessingTime(ULong value); - - /** - * Getter for texera_db.workflow_runtime_statistics.data_processing_time. - */ - public ULong getDataProcessingTime(); - - /** - * Setter for texera_db.workflow_runtime_statistics.control_processing_time. - */ - public void setControlProcessingTime(ULong value); - - /** - * Getter for texera_db.workflow_runtime_statistics.control_processing_time. - */ - public ULong getControlProcessingTime(); - - /** - * Setter for texera_db.workflow_runtime_statistics.idle_time. - */ - public void setIdleTime(ULong value); - - /** - * Getter for texera_db.workflow_runtime_statistics.idle_time. 
- */ - public ULong getIdleTime(); - - /** - * Setter for texera_db.workflow_runtime_statistics.num_workers. - */ - public void setNumWorkers(UInteger value); - - /** - * Getter for texera_db.workflow_runtime_statistics.num_workers. - */ - public UInteger getNumWorkers(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowRuntimeStatistics - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowRuntimeStatistics from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowRuntimeStatistics - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserAccess.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserAccess.java deleted file mode 100644 index bf23795ab5f..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserAccess.java +++ /dev/null @@ -1,62 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.WorkflowUserAccessPrivilege; -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowUserAccess extends Serializable { - - /** - * Setter for texera_db.workflow_user_access.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.workflow_user_access.uid. - */ - public UInteger getUid(); - - /** - * Setter for texera_db.workflow_user_access.wid. 
- */ - public void setWid(UInteger value); - - /** - * Getter for texera_db.workflow_user_access.wid. - */ - public UInteger getWid(); - - /** - * Setter for texera_db.workflow_user_access.privilege. - */ - public void setPrivilege(WorkflowUserAccessPrivilege value); - - /** - * Getter for texera_db.workflow_user_access.privilege. - */ - public WorkflowUserAccessPrivilege getPrivilege(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowUserAccess - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserAccess from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowUserAccess - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserActivity.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserActivity.java deleted file mode 100644 index bf9cfa48b66..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserActivity.java +++ /dev/null @@ -1,82 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowUserActivity extends Serializable { - - /** - * Setter for texera_db.workflow_user_activity.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.workflow_user_activity.uid. 
- */ - public UInteger getUid(); - - /** - * Setter for texera_db.workflow_user_activity.wid. - */ - public void setWid(UInteger value); - - /** - * Getter for texera_db.workflow_user_activity.wid. - */ - public UInteger getWid(); - - /** - * Setter for texera_db.workflow_user_activity.ip. - */ - public void setIp(String value); - - /** - * Getter for texera_db.workflow_user_activity.ip. - */ - public String getIp(); - - /** - * Setter for texera_db.workflow_user_activity.activate. - */ - public void setActivate(String value); - - /** - * Getter for texera_db.workflow_user_activity.activate. - */ - public String getActivate(); - - /** - * Setter for texera_db.workflow_user_activity.activity_time. - */ - public void setActivityTime(Timestamp value); - - /** - * Getter for texera_db.workflow_user_activity.activity_time. - */ - public Timestamp getActivityTime(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowUserActivity - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserActivity from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowUserActivity - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserClones.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserClones.java deleted file mode 100644 index 5bb77c3d4c8..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserClones.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowUserClones extends Serializable { - - /** - * Setter for texera_db.workflow_user_clones.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.workflow_user_clones.uid. - */ - public UInteger getUid(); - - /** - * Setter for texera_db.workflow_user_clones.wid. - */ - public void setWid(UInteger value); - - /** - * Getter for texera_db.workflow_user_clones.wid. - */ - public UInteger getWid(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowUserClones - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserClones from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowUserClones - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserLikes.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserLikes.java deleted file mode 100644 index ae5a5c739e1..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowUserLikes.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowUserLikes extends Serializable { - - /** - * Setter for texera_db.workflow_user_likes.uid. - */ - public void setUid(UInteger value); - - /** - * Getter for texera_db.workflow_user_likes.uid. - */ - public UInteger getUid(); - - /** - * Setter for texera_db.workflow_user_likes.wid. - */ - public void setWid(UInteger value); - - /** - * Getter for texera_db.workflow_user_likes.wid. - */ - public UInteger getWid(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowUserLikes - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserLikes from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowUserLikes - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowVersion.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowVersion.java deleted file mode 100644 index 37b39da1890..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowVersion.java +++ /dev/null @@ -1,72 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowVersion extends Serializable { - - /** - * Setter for texera_db.workflow_version.vid. 
- */ - public void setVid(UInteger value); - - /** - * Getter for texera_db.workflow_version.vid. - */ - public UInteger getVid(); - - /** - * Setter for texera_db.workflow_version.wid. - */ - public void setWid(UInteger value); - - /** - * Getter for texera_db.workflow_version.wid. - */ - public UInteger getWid(); - - /** - * Setter for texera_db.workflow_version.content. - */ - public void setContent(String value); - - /** - * Getter for texera_db.workflow_version.content. - */ - public String getContent(); - - /** - * Setter for texera_db.workflow_version.creation_time. - */ - public void setCreationTime(Timestamp value); - - /** - * Getter for texera_db.workflow_version.creation_time. - */ - public Timestamp getCreationTime(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowVersion - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowVersion from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowVersion - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowViewCount.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowViewCount.java deleted file mode 100644 index 7773aaa735e..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/interfaces/IWorkflowViewCount.java +++ /dev/null @@ -1,51 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces; - - -import org.jooq.types.UInteger; - -import java.io.Serializable; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public interface IWorkflowViewCount extends Serializable { - - /** - * Setter for texera_db.workflow_view_count.wid. - */ - public void setWid(UInteger value); - - /** - * Getter for texera_db.workflow_view_count.wid. - */ - public UInteger getWid(); - - /** - * Setter for texera_db.workflow_view_count.view_count. - */ - public void setViewCount(UInteger value); - - /** - * Getter for texera_db.workflow_view_count.view_count. - */ - public UInteger getViewCount(); - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - /** - * Load data from another generated Record/POJO implementing the common interface IWorkflowViewCount - */ - public void from(edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowViewCount from); - - /** - * Copy data into another generated Record/POJO implementing the common interface IWorkflowViewCount - */ - public E into(E into); -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Dataset.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Dataset.java deleted file mode 100644 index 88c65148ca1..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Dataset.java +++ /dev/null @@ -1,150 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IDataset; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class Dataset implements IDataset { - - private static final long serialVersionUID = 178066411; - - private UInteger did; - private UInteger ownerUid; - private String name; - private Byte isPublic; - private String description; - private Timestamp creationTime; - - public Dataset() { - } - - public Dataset(IDataset value) { - this.did = value.getDid(); - this.ownerUid = value.getOwnerUid(); - this.name = value.getName(); - this.isPublic = value.getIsPublic(); - this.description = value.getDescription(); - this.creationTime = value.getCreationTime(); - } - - public Dataset( - UInteger did, - UInteger ownerUid, - String name, - Byte isPublic, - String description, - Timestamp creationTime - ) { - this.did = did; - this.ownerUid = ownerUid; - this.name = name; - this.isPublic = isPublic; - this.description = description; - this.creationTime = creationTime; - } - - @Override - public UInteger getDid() { - return this.did; - } - - @Override - public void setDid(UInteger did) { - this.did = did; - } - - @Override - public UInteger getOwnerUid() { - return this.ownerUid; - } - - @Override - public void setOwnerUid(UInteger ownerUid) { - this.ownerUid = ownerUid; - } - - @Override - public String getName() { - return this.name; - } - - @Override - public void setName(String name) { - this.name = name; - } - - @Override - public Byte getIsPublic() { - return this.isPublic; - } - - @Override - public void setIsPublic(Byte isPublic) { - this.isPublic = isPublic; - } - - @Override - public String getDescription() { - return this.description; - } - - @Override - public void setDescription(String description) { - this.description = description; - } - - @Override - public Timestamp getCreationTime() { - return this.creationTime; - } - - @Override - public void setCreationTime(Timestamp creationTime) { - this.creationTime = creationTime; - } - - @Override - public String toString() { - StringBuilder sb = new 
StringBuilder("Dataset ("); - - sb.append(did); - sb.append(", ").append(ownerUid); - sb.append(", ").append(name); - sb.append(", ").append(isPublic); - sb.append(", ").append(description); - sb.append(", ").append(creationTime); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IDataset from) { - setDid(from.getDid()); - setOwnerUid(from.getOwnerUid()); - setName(from.getName()); - setIsPublic(from.getIsPublic()); - setDescription(from.getDescription()); - setCreationTime(from.getCreationTime()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/DatasetUserAccess.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/DatasetUserAccess.java deleted file mode 100644 index e4cb34acd0e..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/DatasetUserAccess.java +++ /dev/null @@ -1,101 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.DatasetUserAccessPrivilege; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IDatasetUserAccess; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class DatasetUserAccess implements IDatasetUserAccess { - - private static final long serialVersionUID = 1783622787; - - private UInteger did; - private UInteger uid; - private DatasetUserAccessPrivilege privilege; - - public DatasetUserAccess() { - } - - public DatasetUserAccess(IDatasetUserAccess value) { - this.did = value.getDid(); - this.uid = value.getUid(); - this.privilege = value.getPrivilege(); - } - - public DatasetUserAccess( - UInteger did, - UInteger uid, - DatasetUserAccessPrivilege privilege - ) { - this.did = did; - this.uid = uid; - this.privilege = privilege; - } - - @Override - public UInteger getDid() { - return this.did; - } - - @Override - public void setDid(UInteger did) { - this.did = did; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public DatasetUserAccessPrivilege getPrivilege() { - return this.privilege; - } - - @Override - public void setPrivilege(DatasetUserAccessPrivilege privilege) { - this.privilege = privilege; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("DatasetUserAccess ("); - - sb.append(did); - sb.append(", ").append(uid); - sb.append(", ").append(privilege); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IDatasetUserAccess from) { - setDid(from.getDid()); - setUid(from.getUid()); - setPrivilege(from.getPrivilege()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/DatasetVersion.java 
b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/DatasetVersion.java deleted file mode 100644 index 4c8cb453ed8..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/DatasetVersion.java +++ /dev/null @@ -1,150 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IDatasetVersion; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class DatasetVersion implements IDatasetVersion { - - private static final long serialVersionUID = -1253265124; - - private UInteger dvid; - private UInteger did; - private UInteger creatorUid; - private String name; - private String versionHash; - private Timestamp creationTime; - - public DatasetVersion() { - } - - public DatasetVersion(IDatasetVersion value) { - this.dvid = value.getDvid(); - this.did = value.getDid(); - this.creatorUid = value.getCreatorUid(); - this.name = value.getName(); - this.versionHash = value.getVersionHash(); - this.creationTime = value.getCreationTime(); - } - - public DatasetVersion( - UInteger dvid, - UInteger did, - UInteger creatorUid, - String name, - String versionHash, - Timestamp creationTime - ) { - this.dvid = dvid; - this.did = did; - this.creatorUid = creatorUid; - this.name = name; - this.versionHash = versionHash; - this.creationTime = creationTime; - } - - @Override - public UInteger getDvid() { - return this.dvid; - } - - @Override - public void setDvid(UInteger dvid) { - this.dvid = dvid; - } - - @Override - public UInteger getDid() { - return this.did; - } - - @Override - public void setDid(UInteger did) { - this.did = did; - } - - @Override - public UInteger getCreatorUid() { - return this.creatorUid; - } - - @Override - public void 
setCreatorUid(UInteger creatorUid) { - this.creatorUid = creatorUid; - } - - @Override - public String getName() { - return this.name; - } - - @Override - public void setName(String name) { - this.name = name; - } - - @Override - public String getVersionHash() { - return this.versionHash; - } - - @Override - public void setVersionHash(String versionHash) { - this.versionHash = versionHash; - } - - @Override - public Timestamp getCreationTime() { - return this.creationTime; - } - - @Override - public void setCreationTime(Timestamp creationTime) { - this.creationTime = creationTime; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("DatasetVersion ("); - - sb.append(dvid); - sb.append(", ").append(did); - sb.append(", ").append(creatorUid); - sb.append(", ").append(name); - sb.append(", ").append(versionHash); - sb.append(", ").append(creationTime); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IDatasetVersion from) { - setDvid(from.getDvid()); - setDid(from.getDid()); - setCreatorUid(from.getCreatorUid()); - setName(from.getName()); - setVersionHash(from.getVersionHash()); - setCreationTime(from.getCreationTime()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Project.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Project.java deleted file mode 100644 index b8836242048..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Project.java +++ /dev/null @@ -1,150 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IProject; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class Project implements IProject { - - private static final long serialVersionUID = 1494540996; - - private UInteger pid; - private String name; - private String description; - private UInteger ownerId; - private Timestamp creationTime; - private String color; - - public Project() { - } - - public Project(IProject value) { - this.pid = value.getPid(); - this.name = value.getName(); - this.description = value.getDescription(); - this.ownerId = value.getOwnerId(); - this.creationTime = value.getCreationTime(); - this.color = value.getColor(); - } - - public Project( - UInteger pid, - String name, - String description, - UInteger ownerId, - Timestamp creationTime, - String color - ) { - this.pid = pid; - this.name = name; - this.description = description; - this.ownerId = ownerId; - this.creationTime = creationTime; - this.color = color; - } - - @Override - public UInteger getPid() { - return this.pid; - } - - @Override - public void setPid(UInteger pid) { - this.pid = pid; - } - - @Override - public String getName() { - return this.name; - } - - @Override - public void setName(String name) { - this.name = name; - } - - @Override - public String getDescription() { - return this.description; - } - - @Override - public void setDescription(String description) { - this.description = description; - } - - @Override - public UInteger getOwnerId() { - return this.ownerId; - } - - @Override - public void setOwnerId(UInteger ownerId) { - this.ownerId = ownerId; - } - - @Override - public Timestamp getCreationTime() { - return this.creationTime; - } - - @Override - public void setCreationTime(Timestamp creationTime) { - this.creationTime = creationTime; 
- } - - @Override - public String getColor() { - return this.color; - } - - @Override - public void setColor(String color) { - this.color = color; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("Project ("); - - sb.append(pid); - sb.append(", ").append(name); - sb.append(", ").append(description); - sb.append(", ").append(ownerId); - sb.append(", ").append(creationTime); - sb.append(", ").append(color); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IProject from) { - setPid(from.getPid()); - setName(from.getName()); - setDescription(from.getDescription()); - setOwnerId(from.getOwnerId()); - setCreationTime(from.getCreationTime()); - setColor(from.getColor()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/ProjectUserAccess.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/ProjectUserAccess.java deleted file mode 100644 index 885b6b5c623..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/ProjectUserAccess.java +++ /dev/null @@ -1,101 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.ProjectUserAccessPrivilege; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IProjectUserAccess; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class ProjectUserAccess implements IProjectUserAccess { - - private static final long serialVersionUID = -27326996; - - private UInteger uid; - private UInteger pid; - private ProjectUserAccessPrivilege privilege; - - public ProjectUserAccess() { - } - - public ProjectUserAccess(IProjectUserAccess value) { - this.uid = value.getUid(); - this.pid = value.getPid(); - this.privilege = value.getPrivilege(); - } - - public ProjectUserAccess( - UInteger uid, - UInteger pid, - ProjectUserAccessPrivilege privilege - ) { - this.uid = uid; - this.pid = pid; - this.privilege = privilege; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public UInteger getPid() { - return this.pid; - } - - @Override - public void setPid(UInteger pid) { - this.pid = pid; - } - - @Override - public ProjectUserAccessPrivilege getPrivilege() { - return this.privilege; - } - - @Override - public void setPrivilege(ProjectUserAccessPrivilege privilege) { - this.privilege = privilege; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("ProjectUserAccess ("); - - sb.append(uid); - sb.append(", ").append(pid); - sb.append(", ").append(privilege); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IProjectUserAccess from) { - setUid(from.getUid()); - setPid(from.getPid()); - setPrivilege(from.getPrivilege()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/PublicProject.java 
b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/PublicProject.java deleted file mode 100644 index 1273fe8586c..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/PublicProject.java +++ /dev/null @@ -1,84 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IPublicProject; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class PublicProject implements IPublicProject { - - private static final long serialVersionUID = -956065810; - - private UInteger pid; - private UInteger uid; - - public PublicProject() { - } - - public PublicProject(IPublicProject value) { - this.pid = value.getPid(); - this.uid = value.getUid(); - } - - public PublicProject( - UInteger pid, - UInteger uid - ) { - this.pid = pid; - this.uid = uid; - } - - @Override - public UInteger getPid() { - return this.pid; - } - - @Override - public void setPid(UInteger pid) { - this.pid = pid; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("PublicProject ("); - - sb.append(pid); - sb.append(", ").append(uid); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IPublicProject from) { - setPid(from.getPid()); - setUid(from.getUid()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git 
a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/User.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/User.java deleted file mode 100644 index da4c3574eb1..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/User.java +++ /dev/null @@ -1,165 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IUser; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class User implements IUser { - - private static final long serialVersionUID = -1446129198; - - private UInteger uid; - private String name; - private String email; - private String password; - private String googleId; - private UserRole role; - private String googleAvatar; - - public User() { - } - - public User(IUser value) { - this.uid = value.getUid(); - this.name = value.getName(); - this.email = value.getEmail(); - this.password = value.getPassword(); - this.googleId = value.getGoogleId(); - this.role = value.getRole(); - this.googleAvatar = value.getGoogleAvatar(); - } - - public User( - UInteger uid, - String name, - String email, - String password, - String googleId, - UserRole role, - String googleAvatar - ) { - this.uid = uid; - this.name = name; - this.email = email; - this.password = password; - this.googleId = googleId; - this.role = role; - this.googleAvatar = googleAvatar; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public String getName() { - return this.name; - } - - @Override - public void setName(String name) { - this.name = name; - } - - @Override - public String 
getEmail() { - return this.email; - } - - @Override - public void setEmail(String email) { - this.email = email; - } - - @Override - public String getPassword() { - return this.password; - } - - @Override - public void setPassword(String password) { - this.password = password; - } - - @Override - public String getGoogleId() { - return this.googleId; - } - - @Override - public void setGoogleId(String googleId) { - this.googleId = googleId; - } - - @Override - public UserRole getRole() { - return this.role; - } - - @Override - public void setRole(UserRole role) { - this.role = role; - } - - @Override - public String getGoogleAvatar() { - return this.googleAvatar; - } - - @Override - public void setGoogleAvatar(String googleAvatar) { - this.googleAvatar = googleAvatar; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("User ("); - - sb.append(uid); - sb.append(", ").append(name); - sb.append(", ").append(email); - sb.append(", ").append(password); - sb.append(", ").append(googleId); - sb.append(", ").append(role); - sb.append(", ").append(googleAvatar); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IUser from) { - setUid(from.getUid()); - setName(from.getName()); - setEmail(from.getEmail()); - setPassword(from.getPassword()); - setGoogleId(from.getGoogleId()); - setRole(from.getRole()); - setGoogleAvatar(from.getGoogleAvatar()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/UserConfig.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/UserConfig.java deleted file mode 100644 index 8ac35264709..00000000000 --- 
a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/UserConfig.java +++ /dev/null @@ -1,100 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IUserConfig; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class UserConfig implements IUserConfig { - - private static final long serialVersionUID = 757070019; - - private UInteger uid; - private String key; - private String value; - - public UserConfig() { - } - - public UserConfig(IUserConfig value) { - this.uid = value.getUid(); - this.key = value.getKey(); - this.value = value.getValue(); - } - - public UserConfig( - UInteger uid, - String key, - String value - ) { - this.uid = uid; - this.key = key; - this.value = value; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public String getKey() { - return this.key; - } - - @Override - public void setKey(String key) { - this.key = key; - } - - @Override - public String getValue() { - return this.value; - } - - @Override - public void setValue(String value) { - this.value = value; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("UserConfig ("); - - sb.append(uid); - sb.append(", ").append(key); - sb.append(", ").append(value); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IUserConfig from) { - setUid(from.getUid()); - setKey(from.getKey()); - setValue(from.getValue()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } 
-} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Workflow.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Workflow.java deleted file mode 100644 index 0da026af978..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/Workflow.java +++ /dev/null @@ -1,166 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflow; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class Workflow implements IWorkflow { - - private static final long serialVersionUID = 171585219; - - private String name; - private String description; - private UInteger wid; - private String content; - private Timestamp creationTime; - private Timestamp lastModifiedTime; - private Byte isPublished; - - public Workflow() { - } - - public Workflow(IWorkflow value) { - this.name = value.getName(); - this.description = value.getDescription(); - this.wid = value.getWid(); - this.content = value.getContent(); - this.creationTime = value.getCreationTime(); - this.lastModifiedTime = value.getLastModifiedTime(); - this.isPublished = value.getIsPublished(); - } - - public Workflow( - String name, - String description, - UInteger wid, - String content, - Timestamp creationTime, - Timestamp lastModifiedTime, - Byte isPublished - ) { - this.name = name; - this.description = description; - this.wid = wid; - this.content = content; - this.creationTime = creationTime; - this.lastModifiedTime = lastModifiedTime; - this.isPublished = isPublished; - } - - @Override - public String getName() { - return this.name; - } - - @Override - public void setName(String name) { - this.name = name; - } - - @Override - public String 
getDescription() { - return this.description; - } - - @Override - public void setDescription(String description) { - this.description = description; - } - - @Override - public UInteger getWid() { - return this.wid; - } - - @Override - public void setWid(UInteger wid) { - this.wid = wid; - } - - @Override - public String getContent() { - return this.content; - } - - @Override - public void setContent(String content) { - this.content = content; - } - - @Override - public Timestamp getCreationTime() { - return this.creationTime; - } - - @Override - public void setCreationTime(Timestamp creationTime) { - this.creationTime = creationTime; - } - - @Override - public Timestamp getLastModifiedTime() { - return this.lastModifiedTime; - } - - @Override - public void setLastModifiedTime(Timestamp lastModifiedTime) { - this.lastModifiedTime = lastModifiedTime; - } - - @Override - public Byte getIsPublished() { - return this.isPublished; - } - - @Override - public void setIsPublished(Byte isPublished) { - this.isPublished = isPublished; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("Workflow ("); - - sb.append(name); - sb.append(", ").append(description); - sb.append(", ").append(wid); - sb.append(", ").append(content); - sb.append(", ").append(creationTime); - sb.append(", ").append(lastModifiedTime); - sb.append(", ").append(isPublished); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflow from) { - setName(from.getName()); - setDescription(from.getDescription()); - setWid(from.getWid()); - setContent(from.getContent()); - setCreationTime(from.getCreationTime()); - setLastModifiedTime(from.getLastModifiedTime()); - setIsPublished(from.getIsPublished()); - } - - @Override - public E into(E into) { - into.from(this); - 
return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowExecutions.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowExecutions.java deleted file mode 100644 index e4a46eb8aec..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowExecutions.java +++ /dev/null @@ -1,230 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowExecutions; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowExecutions implements IWorkflowExecutions { - - private static final long serialVersionUID = -379967912; - - private UInteger eid; - private UInteger vid; - private UInteger uid; - private Byte status; - private String result; - private Timestamp startingTime; - private Timestamp lastUpdateTime; - private Byte bookmarked; - private String name; - private String environmentVersion; - private String logLocation; - - public WorkflowExecutions() { - } - - public WorkflowExecutions(IWorkflowExecutions value) { - this.eid = value.getEid(); - this.vid = value.getVid(); - this.uid = value.getUid(); - this.status = value.getStatus(); - this.result = value.getResult(); - this.startingTime = value.getStartingTime(); - this.lastUpdateTime = value.getLastUpdateTime(); - this.bookmarked = value.getBookmarked(); - this.name = value.getName(); - this.environmentVersion = value.getEnvironmentVersion(); - this.logLocation = value.getLogLocation(); - } - - public WorkflowExecutions( - UInteger eid, - UInteger vid, - UInteger uid, - Byte status, - String result, - Timestamp startingTime, - Timestamp lastUpdateTime, - Byte bookmarked, - String name, - 
String environmentVersion, - String logLocation - ) { - this.eid = eid; - this.vid = vid; - this.uid = uid; - this.status = status; - this.result = result; - this.startingTime = startingTime; - this.lastUpdateTime = lastUpdateTime; - this.bookmarked = bookmarked; - this.name = name; - this.environmentVersion = environmentVersion; - this.logLocation = logLocation; - } - - @Override - public UInteger getEid() { - return this.eid; - } - - @Override - public void setEid(UInteger eid) { - this.eid = eid; - } - - @Override - public UInteger getVid() { - return this.vid; - } - - @Override - public void setVid(UInteger vid) { - this.vid = vid; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public Byte getStatus() { - return this.status; - } - - @Override - public void setStatus(Byte status) { - this.status = status; - } - - @Override - public String getResult() { - return this.result; - } - - @Override - public void setResult(String result) { - this.result = result; - } - - @Override - public Timestamp getStartingTime() { - return this.startingTime; - } - - @Override - public void setStartingTime(Timestamp startingTime) { - this.startingTime = startingTime; - } - - @Override - public Timestamp getLastUpdateTime() { - return this.lastUpdateTime; - } - - @Override - public void setLastUpdateTime(Timestamp lastUpdateTime) { - this.lastUpdateTime = lastUpdateTime; - } - - @Override - public Byte getBookmarked() { - return this.bookmarked; - } - - @Override - public void setBookmarked(Byte bookmarked) { - this.bookmarked = bookmarked; - } - - @Override - public String getName() { - return this.name; - } - - @Override - public void setName(String name) { - this.name = name; - } - - @Override - public String getEnvironmentVersion() { - return this.environmentVersion; - } - - @Override - public void setEnvironmentVersion(String environmentVersion) { - 
this.environmentVersion = environmentVersion; - } - - @Override - public String getLogLocation() { - return this.logLocation; - } - - @Override - public void setLogLocation(String logLocation) { - this.logLocation = logLocation; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowExecutions ("); - - sb.append(eid); - sb.append(", ").append(vid); - sb.append(", ").append(uid); - sb.append(", ").append(status); - sb.append(", ").append(result); - sb.append(", ").append(startingTime); - sb.append(", ").append(lastUpdateTime); - sb.append(", ").append(bookmarked); - sb.append(", ").append(name); - sb.append(", ").append(environmentVersion); - sb.append(", ").append(logLocation); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowExecutions from) { - setEid(from.getEid()); - setVid(from.getVid()); - setUid(from.getUid()); - setStatus(from.getStatus()); - setResult(from.getResult()); - setStartingTime(from.getStartingTime()); - setLastUpdateTime(from.getLastUpdateTime()); - setBookmarked(from.getBookmarked()); - setName(from.getName()); - setEnvironmentVersion(from.getEnvironmentVersion()); - setLogLocation(from.getLogLocation()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowOfProject.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowOfProject.java deleted file mode 100644 index 7376ffa8c61..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowOfProject.java +++ /dev/null @@ -1,84 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowOfProject; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowOfProject implements IWorkflowOfProject { - - private static final long serialVersionUID = -2015179902; - - private UInteger wid; - private UInteger pid; - - public WorkflowOfProject() { - } - - public WorkflowOfProject(IWorkflowOfProject value) { - this.wid = value.getWid(); - this.pid = value.getPid(); - } - - public WorkflowOfProject( - UInteger wid, - UInteger pid - ) { - this.wid = wid; - this.pid = pid; - } - - @Override - public UInteger getWid() { - return this.wid; - } - - @Override - public void setWid(UInteger wid) { - this.wid = wid; - } - - @Override - public UInteger getPid() { - return this.pid; - } - - @Override - public void setPid(UInteger pid) { - this.pid = pid; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowOfProject ("); - - sb.append(wid); - sb.append(", ").append(pid); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowOfProject from) { - setWid(from.getWid()); - setPid(from.getPid()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowOfUser.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowOfUser.java deleted file mode 100644 index 797c74e75b1..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowOfUser.java +++ /dev/null 
@@ -1,84 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowOfUser; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowOfUser implements IWorkflowOfUser { - - private static final long serialVersionUID = -698732761; - - private UInteger uid; - private UInteger wid; - - public WorkflowOfUser() { - } - - public WorkflowOfUser(IWorkflowOfUser value) { - this.uid = value.getUid(); - this.wid = value.getWid(); - } - - public WorkflowOfUser( - UInteger uid, - UInteger wid - ) { - this.uid = uid; - this.wid = wid; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public UInteger getWid() { - return this.wid; - } - - @Override - public void setWid(UInteger wid) { - this.wid = wid; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowOfUser ("); - - sb.append(uid); - sb.append(", ").append(wid); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowOfUser from) { - setUid(from.getUid()); - setWid(from.getWid()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowRuntimeStatistics.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowRuntimeStatistics.java deleted file mode 100644 index ea796fc6b9e..00000000000 --- 
a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowRuntimeStatistics.java +++ /dev/null @@ -1,231 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowRuntimeStatistics; -import org.jooq.types.UInteger; -import org.jooq.types.ULong; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowRuntimeStatistics implements IWorkflowRuntimeStatistics { - - private static final long serialVersionUID = -2045109677; - - private UInteger workflowId; - private UInteger executionId; - private String operatorId; - private Timestamp time; - private UInteger inputTupleCnt; - private UInteger outputTupleCnt; - private Byte status; - private ULong dataProcessingTime; - private ULong controlProcessingTime; - private ULong idleTime; - private UInteger numWorkers; - - public WorkflowRuntimeStatistics() { - } - - public WorkflowRuntimeStatistics(IWorkflowRuntimeStatistics value) { - this.workflowId = value.getWorkflowId(); - this.executionId = value.getExecutionId(); - this.operatorId = value.getOperatorId(); - this.time = value.getTime(); - this.inputTupleCnt = value.getInputTupleCnt(); - this.outputTupleCnt = value.getOutputTupleCnt(); - this.status = value.getStatus(); - this.dataProcessingTime = value.getDataProcessingTime(); - this.controlProcessingTime = value.getControlProcessingTime(); - this.idleTime = value.getIdleTime(); - this.numWorkers = value.getNumWorkers(); - } - - public WorkflowRuntimeStatistics( - UInteger workflowId, - UInteger executionId, - String operatorId, - Timestamp time, - UInteger inputTupleCnt, - UInteger outputTupleCnt, - Byte status, - ULong dataProcessingTime, - ULong controlProcessingTime, - ULong idleTime, - UInteger numWorkers - ) { - this.workflowId = workflowId; 
- this.executionId = executionId; - this.operatorId = operatorId; - this.time = time; - this.inputTupleCnt = inputTupleCnt; - this.outputTupleCnt = outputTupleCnt; - this.status = status; - this.dataProcessingTime = dataProcessingTime; - this.controlProcessingTime = controlProcessingTime; - this.idleTime = idleTime; - this.numWorkers = numWorkers; - } - - @Override - public UInteger getWorkflowId() { - return this.workflowId; - } - - @Override - public void setWorkflowId(UInteger workflowId) { - this.workflowId = workflowId; - } - - @Override - public UInteger getExecutionId() { - return this.executionId; - } - - @Override - public void setExecutionId(UInteger executionId) { - this.executionId = executionId; - } - - @Override - public String getOperatorId() { - return this.operatorId; - } - - @Override - public void setOperatorId(String operatorId) { - this.operatorId = operatorId; - } - - @Override - public Timestamp getTime() { - return this.time; - } - - @Override - public void setTime(Timestamp time) { - this.time = time; - } - - @Override - public UInteger getInputTupleCnt() { - return this.inputTupleCnt; - } - - @Override - public void setInputTupleCnt(UInteger inputTupleCnt) { - this.inputTupleCnt = inputTupleCnt; - } - - @Override - public UInteger getOutputTupleCnt() { - return this.outputTupleCnt; - } - - @Override - public void setOutputTupleCnt(UInteger outputTupleCnt) { - this.outputTupleCnt = outputTupleCnt; - } - - @Override - public Byte getStatus() { - return this.status; - } - - @Override - public void setStatus(Byte status) { - this.status = status; - } - - @Override - public ULong getDataProcessingTime() { - return this.dataProcessingTime; - } - - @Override - public void setDataProcessingTime(ULong dataProcessingTime) { - this.dataProcessingTime = dataProcessingTime; - } - - @Override - public ULong getControlProcessingTime() { - return this.controlProcessingTime; - } - - @Override - public void setControlProcessingTime(ULong 
controlProcessingTime) { - this.controlProcessingTime = controlProcessingTime; - } - - @Override - public ULong getIdleTime() { - return this.idleTime; - } - - @Override - public void setIdleTime(ULong idleTime) { - this.idleTime = idleTime; - } - - @Override - public UInteger getNumWorkers() { - return this.numWorkers; - } - - @Override - public void setNumWorkers(UInteger numWorkers) { - this.numWorkers = numWorkers; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowRuntimeStatistics ("); - - sb.append(workflowId); - sb.append(", ").append(executionId); - sb.append(", ").append(operatorId); - sb.append(", ").append(time); - sb.append(", ").append(inputTupleCnt); - sb.append(", ").append(outputTupleCnt); - sb.append(", ").append(status); - sb.append(", ").append(dataProcessingTime); - sb.append(", ").append(controlProcessingTime); - sb.append(", ").append(idleTime); - sb.append(", ").append(numWorkers); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowRuntimeStatistics from) { - setWorkflowId(from.getWorkflowId()); - setExecutionId(from.getExecutionId()); - setOperatorId(from.getOperatorId()); - setTime(from.getTime()); - setInputTupleCnt(from.getInputTupleCnt()); - setOutputTupleCnt(from.getOutputTupleCnt()); - setStatus(from.getStatus()); - setDataProcessingTime(from.getDataProcessingTime()); - setControlProcessingTime(from.getControlProcessingTime()); - setIdleTime(from.getIdleTime()); - setNumWorkers(from.getNumWorkers()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserAccess.java 
b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserAccess.java deleted file mode 100644 index 9993c4dff54..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserAccess.java +++ /dev/null @@ -1,101 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.WorkflowUserAccessPrivilege; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserAccess; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserAccess implements IWorkflowUserAccess { - - private static final long serialVersionUID = -354047803; - - private UInteger uid; - private UInteger wid; - private WorkflowUserAccessPrivilege privilege; - - public WorkflowUserAccess() { - } - - public WorkflowUserAccess(IWorkflowUserAccess value) { - this.uid = value.getUid(); - this.wid = value.getWid(); - this.privilege = value.getPrivilege(); - } - - public WorkflowUserAccess( - UInteger uid, - UInteger wid, - WorkflowUserAccessPrivilege privilege - ) { - this.uid = uid; - this.wid = wid; - this.privilege = privilege; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public UInteger getWid() { - return this.wid; - } - - @Override - public void setWid(UInteger wid) { - this.wid = wid; - } - - @Override - public WorkflowUserAccessPrivilege getPrivilege() { - return this.privilege; - } - - @Override - public void setPrivilege(WorkflowUserAccessPrivilege privilege) { - this.privilege = privilege; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowUserAccess ("); - - sb.append(uid); - sb.append(", ").append(wid); - 
sb.append(", ").append(privilege); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowUserAccess from) { - setUid(from.getUid()); - setWid(from.getWid()); - setPrivilege(from.getPrivilege()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserActivity.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserActivity.java deleted file mode 100644 index 03ef8107ac1..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserActivity.java +++ /dev/null @@ -1,134 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserActivity; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserActivity implements IWorkflowUserActivity { - - private static final long serialVersionUID = 245043363; - - private UInteger uid; - private UInteger wid; - private String ip; - private String activate; - private Timestamp activityTime; - - public WorkflowUserActivity() { - } - - public WorkflowUserActivity(IWorkflowUserActivity value) { - this.uid = value.getUid(); - this.wid = value.getWid(); - this.ip = value.getIp(); - this.activate = value.getActivate(); - this.activityTime = value.getActivityTime(); - } - - public WorkflowUserActivity( - UInteger uid, - UInteger wid, - String ip, - String activate, - Timestamp activityTime - ) { - this.uid = uid; - this.wid = wid; - this.ip = ip; - this.activate = activate; - this.activityTime = activityTime; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public UInteger getWid() { - return this.wid; - } - - @Override - public void setWid(UInteger wid) { - this.wid = wid; - } - - @Override - public String getIp() { - return this.ip; - } - - @Override - public void setIp(String ip) { - this.ip = ip; - } - - @Override - public String getActivate() { - return this.activate; - } - - @Override - public void setActivate(String activate) { - this.activate = activate; - } - - @Override - public Timestamp getActivityTime() { - return this.activityTime; - } - - @Override - public void setActivityTime(Timestamp activityTime) { - this.activityTime = activityTime; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowUserActivity ("); - - sb.append(uid); - sb.append(", ").append(wid); - sb.append(", ").append(ip); - sb.append(", ").append(activate); - sb.append(", ").append(activityTime); - - sb.append(")"); - return sb.toString(); - } - - // 
------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowUserActivity from) { - setUid(from.getUid()); - setWid(from.getWid()); - setIp(from.getIp()); - setActivate(from.getActivate()); - setActivityTime(from.getActivityTime()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserClones.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserClones.java deleted file mode 100644 index 5dc064aff06..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserClones.java +++ /dev/null @@ -1,84 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserClones; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserClones implements IWorkflowUserClones { - - private static final long serialVersionUID = 251192233; - - private UInteger uid; - private UInteger wid; - - public WorkflowUserClones() { - } - - public WorkflowUserClones(IWorkflowUserClones value) { - this.uid = value.getUid(); - this.wid = value.getWid(); - } - - public WorkflowUserClones( - UInteger uid, - UInteger wid - ) { - this.uid = uid; - this.wid = wid; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public UInteger getWid() { - return this.wid; - } - - @Override - public void setWid(UInteger wid) { - this.wid = wid; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowUserClones ("); - - sb.append(uid); - sb.append(", ").append(wid); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowUserClones from) { - setUid(from.getUid()); - setWid(from.getWid()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserLikes.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserLikes.java deleted file mode 100644 index f2d6600c34d..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowUserLikes.java +++ /dev/null @@ -1,84 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserLikes; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserLikes implements IWorkflowUserLikes { - - private static final long serialVersionUID = -632412413; - - private UInteger uid; - private UInteger wid; - - public WorkflowUserLikes() { - } - - public WorkflowUserLikes(IWorkflowUserLikes value) { - this.uid = value.getUid(); - this.wid = value.getWid(); - } - - public WorkflowUserLikes( - UInteger uid, - UInteger wid - ) { - this.uid = uid; - this.wid = wid; - } - - @Override - public UInteger getUid() { - return this.uid; - } - - @Override - public void setUid(UInteger uid) { - this.uid = uid; - } - - @Override - public UInteger getWid() { - return this.wid; - } - - @Override - public void setWid(UInteger wid) { - this.wid = wid; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowUserLikes ("); - - sb.append(uid); - sb.append(", ").append(wid); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowUserLikes from) { - setUid(from.getUid()); - setWid(from.getWid()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowVersion.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowVersion.java deleted file mode 100644 index ac4d0d24f8b..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowVersion.java +++ 
/dev/null @@ -1,118 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowVersion; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowVersion implements IWorkflowVersion { - - private static final long serialVersionUID = -1278018119; - - private UInteger vid; - private UInteger wid; - private String content; - private Timestamp creationTime; - - public WorkflowVersion() { - } - - public WorkflowVersion(IWorkflowVersion value) { - this.vid = value.getVid(); - this.wid = value.getWid(); - this.content = value.getContent(); - this.creationTime = value.getCreationTime(); - } - - public WorkflowVersion( - UInteger vid, - UInteger wid, - String content, - Timestamp creationTime - ) { - this.vid = vid; - this.wid = wid; - this.content = content; - this.creationTime = creationTime; - } - - @Override - public UInteger getVid() { - return this.vid; - } - - @Override - public void setVid(UInteger vid) { - this.vid = vid; - } - - @Override - public UInteger getWid() { - return this.wid; - } - - @Override - public void setWid(UInteger wid) { - this.wid = wid; - } - - @Override - public String getContent() { - return this.content; - } - - @Override - public void setContent(String content) { - this.content = content; - } - - @Override - public Timestamp getCreationTime() { - return this.creationTime; - } - - @Override - public void setCreationTime(Timestamp creationTime) { - this.creationTime = creationTime; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowVersion ("); - - sb.append(vid); - sb.append(", ").append(wid); - sb.append(", ").append(content); - sb.append(", ").append(creationTime); - - sb.append(")"); - return sb.toString(); - } - - // 
------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowVersion from) { - setVid(from.getVid()); - setWid(from.getWid()); - setContent(from.getContent()); - setCreationTime(from.getCreationTime()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowViewCount.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowViewCount.java deleted file mode 100644 index 9227ed373aa..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/pojos/WorkflowViewCount.java +++ /dev/null @@ -1,84 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.pojos; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowViewCount; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowViewCount implements IWorkflowViewCount { - - private static final long serialVersionUID = 2069501897; - - private UInteger wid; - private UInteger viewCount; - - public WorkflowViewCount() { - } - - public WorkflowViewCount(IWorkflowViewCount value) { - this.wid = value.getWid(); - this.viewCount = value.getViewCount(); - } - - public WorkflowViewCount( - UInteger wid, - UInteger viewCount - ) { - this.wid = wid; - this.viewCount = viewCount; - } - - @Override - public UInteger getWid() { - return this.wid; - } - - @Override - public void setWid(UInteger wid) { - this.wid = wid; - } - - @Override - public UInteger getViewCount() { - return this.viewCount; - } - - @Override - public void setViewCount(UInteger viewCount) { - this.viewCount = viewCount; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("WorkflowViewCount ("); - - sb.append(wid); - sb.append(", ").append(viewCount); - - sb.append(")"); - return sb.toString(); - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowViewCount from) { - setWid(from.getWid()); - setViewCount(from.getViewCount()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetRecord.java deleted file mode 100644 index c8a32c43d9a..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetRecord.java +++ /dev/null @@ -1,327 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.Dataset; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IDataset; -import org.jooq.Field; -import org.jooq.Record1; -import org.jooq.Record6; -import org.jooq.Row6; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class DatasetRecord extends UpdatableRecordImpl implements Record6, IDataset { - - private static final long serialVersionUID = -568387354; - - /** - * Setter for texera_db.dataset.did. - */ - @Override - public void setDid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.dataset.did. - */ - @Override - public UInteger getDid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.dataset.owner_uid. - */ - @Override - public void setOwnerUid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.dataset.owner_uid. - */ - @Override - public UInteger getOwnerUid() { - return (UInteger) get(1); - } - - /** - * Setter for texera_db.dataset.name. - */ - @Override - public void setName(String value) { - set(2, value); - } - - /** - * Getter for texera_db.dataset.name. - */ - @Override - public String getName() { - return (String) get(2); - } - - /** - * Setter for texera_db.dataset.is_public. - */ - @Override - public void setIsPublic(Byte value) { - set(3, value); - } - - /** - * Getter for texera_db.dataset.is_public. - */ - @Override - public Byte getIsPublic() { - return (Byte) get(3); - } - - /** - * Setter for texera_db.dataset.description. - */ - @Override - public void setDescription(String value) { - set(4, value); - } - - /** - * Getter for texera_db.dataset.description. 
- */ - @Override - public String getDescription() { - return (String) get(4); - } - - /** - * Setter for texera_db.dataset.creation_time. - */ - @Override - public void setCreationTime(Timestamp value) { - set(5, value); - } - - /** - * Getter for texera_db.dataset.creation_time. - */ - @Override - public Timestamp getCreationTime() { - return (Timestamp) get(5); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record1 key() { - return (Record1) super.key(); - } - - // ------------------------------------------------------------------------- - // Record6 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row6 fieldsRow() { - return (Row6) super.fieldsRow(); - } - - @Override - public Row6 valuesRow() { - return (Row6) super.valuesRow(); - } - - @Override - public Field field1() { - return Dataset.DATASET.DID; - } - - @Override - public Field field2() { - return Dataset.DATASET.OWNER_UID; - } - - @Override - public Field field3() { - return Dataset.DATASET.NAME; - } - - @Override - public Field field4() { - return Dataset.DATASET.IS_PUBLIC; - } - - @Override - public Field field5() { - return Dataset.DATASET.DESCRIPTION; - } - - @Override - public Field field6() { - return Dataset.DATASET.CREATION_TIME; - } - - @Override - public UInteger component1() { - return getDid(); - } - - @Override - public UInteger component2() { - return getOwnerUid(); - } - - @Override - public String component3() { - return getName(); - } - - @Override - public Byte component4() { - return getIsPublic(); - } - - @Override - public String component5() { - return getDescription(); - } - - @Override - public Timestamp component6() { - return getCreationTime(); - } - - @Override - public UInteger value1() { - return getDid(); - } - - @Override - public 
UInteger value2() { - return getOwnerUid(); - } - - @Override - public String value3() { - return getName(); - } - - @Override - public Byte value4() { - return getIsPublic(); - } - - @Override - public String value5() { - return getDescription(); - } - - @Override - public Timestamp value6() { - return getCreationTime(); - } - - @Override - public DatasetRecord value1(UInteger value) { - setDid(value); - return this; - } - - @Override - public DatasetRecord value2(UInteger value) { - setOwnerUid(value); - return this; - } - - @Override - public DatasetRecord value3(String value) { - setName(value); - return this; - } - - @Override - public DatasetRecord value4(Byte value) { - setIsPublic(value); - return this; - } - - @Override - public DatasetRecord value5(String value) { - setDescription(value); - return this; - } - - @Override - public DatasetRecord value6(Timestamp value) { - setCreationTime(value); - return this; - } - - @Override - public DatasetRecord values(UInteger value1, UInteger value2, String value3, Byte value4, String value5, Timestamp value6) { - value1(value1); - value2(value2); - value3(value3); - value4(value4); - value5(value5); - value6(value6); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IDataset from) { - setDid(from.getDid()); - setOwnerUid(from.getOwnerUid()); - setName(from.getName()); - setIsPublic(from.getIsPublic()); - setDescription(from.getDescription()); - setCreationTime(from.getCreationTime()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached DatasetRecord - */ - public DatasetRecord() { - 
super(Dataset.DATASET); - } - - /** - * Create a detached, initialised DatasetRecord - */ - public DatasetRecord(UInteger did, UInteger ownerUid, String name, Byte isPublic, String description, Timestamp creationTime) { - super(Dataset.DATASET); - - set(0, did); - set(1, ownerUid); - set(2, name); - set(3, isPublic); - set(4, description); - set(5, creationTime); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetUserAccessRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetUserAccessRecord.java deleted file mode 100644 index c779fb0556f..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetUserAccessRecord.java +++ /dev/null @@ -1,206 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.DatasetUserAccessPrivilege; -import edu.uci.ics.texera.web.model.jooq.generated.tables.DatasetUserAccess; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IDatasetUserAccess; -import org.jooq.Field; -import org.jooq.Record2; -import org.jooq.Record3; -import org.jooq.Row3; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class DatasetUserAccessRecord extends UpdatableRecordImpl implements Record3, IDatasetUserAccess { - - private static final long serialVersionUID = -417509378; - - /** - * Setter for texera_db.dataset_user_access.did. - */ - @Override - public void setDid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.dataset_user_access.did. - */ - @Override - public UInteger getDid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.dataset_user_access.uid. 
- */ - @Override - public void setUid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.dataset_user_access.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(1); - } - - /** - * Setter for texera_db.dataset_user_access.privilege. - */ - @Override - public void setPrivilege(DatasetUserAccessPrivilege value) { - set(2, value); - } - - /** - * Getter for texera_db.dataset_user_access.privilege. - */ - @Override - public DatasetUserAccessPrivilege getPrivilege() { - return (DatasetUserAccessPrivilege) get(2); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record2 key() { - return (Record2) super.key(); - } - - // ------------------------------------------------------------------------- - // Record3 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row3 fieldsRow() { - return (Row3) super.fieldsRow(); - } - - @Override - public Row3 valuesRow() { - return (Row3) super.valuesRow(); - } - - @Override - public Field field1() { - return DatasetUserAccess.DATASET_USER_ACCESS.DID; - } - - @Override - public Field field2() { - return DatasetUserAccess.DATASET_USER_ACCESS.UID; - } - - @Override - public Field field3() { - return DatasetUserAccess.DATASET_USER_ACCESS.PRIVILEGE; - } - - @Override - public UInteger component1() { - return getDid(); - } - - @Override - public UInteger component2() { - return getUid(); - } - - @Override - public DatasetUserAccessPrivilege component3() { - return getPrivilege(); - } - - @Override - public UInteger value1() { - return getDid(); - } - - @Override - public UInteger value2() { - return getUid(); - } - - @Override - public DatasetUserAccessPrivilege value3() { - return getPrivilege(); - } - - @Override - public DatasetUserAccessRecord value1(UInteger 
value) { - setDid(value); - return this; - } - - @Override - public DatasetUserAccessRecord value2(UInteger value) { - setUid(value); - return this; - } - - @Override - public DatasetUserAccessRecord value3(DatasetUserAccessPrivilege value) { - setPrivilege(value); - return this; - } - - @Override - public DatasetUserAccessRecord values(UInteger value1, UInteger value2, DatasetUserAccessPrivilege value3) { - value1(value1); - value2(value2); - value3(value3); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IDatasetUserAccess from) { - setDid(from.getDid()); - setUid(from.getUid()); - setPrivilege(from.getPrivilege()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached DatasetUserAccessRecord - */ - public DatasetUserAccessRecord() { - super(DatasetUserAccess.DATASET_USER_ACCESS); - } - - /** - * Create a detached, initialised DatasetUserAccessRecord - */ - public DatasetUserAccessRecord(UInteger did, UInteger uid, DatasetUserAccessPrivilege privilege) { - super(DatasetUserAccess.DATASET_USER_ACCESS); - - set(0, did); - set(1, uid); - set(2, privilege); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetVersionRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetVersionRecord.java deleted file mode 100644 index fa0ce6fb35c..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/DatasetVersionRecord.java +++ /dev/null @@ -1,327 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.DatasetVersion; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IDatasetVersion; -import org.jooq.Field; -import org.jooq.Record1; -import org.jooq.Record6; -import org.jooq.Row6; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class DatasetVersionRecord extends UpdatableRecordImpl implements Record6, IDatasetVersion { - - private static final long serialVersionUID = -870558780; - - /** - * Setter for texera_db.dataset_version.dvid. - */ - @Override - public void setDvid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.dataset_version.dvid. - */ - @Override - public UInteger getDvid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.dataset_version.did. - */ - @Override - public void setDid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.dataset_version.did. - */ - @Override - public UInteger getDid() { - return (UInteger) get(1); - } - - /** - * Setter for texera_db.dataset_version.creator_uid. - */ - @Override - public void setCreatorUid(UInteger value) { - set(2, value); - } - - /** - * Getter for texera_db.dataset_version.creator_uid. - */ - @Override - public UInteger getCreatorUid() { - return (UInteger) get(2); - } - - /** - * Setter for texera_db.dataset_version.name. - */ - @Override - public void setName(String value) { - set(3, value); - } - - /** - * Getter for texera_db.dataset_version.name. - */ - @Override - public String getName() { - return (String) get(3); - } - - /** - * Setter for texera_db.dataset_version.version_hash. 
- */ - @Override - public void setVersionHash(String value) { - set(4, value); - } - - /** - * Getter for texera_db.dataset_version.version_hash. - */ - @Override - public String getVersionHash() { - return (String) get(4); - } - - /** - * Setter for texera_db.dataset_version.creation_time. - */ - @Override - public void setCreationTime(Timestamp value) { - set(5, value); - } - - /** - * Getter for texera_db.dataset_version.creation_time. - */ - @Override - public Timestamp getCreationTime() { - return (Timestamp) get(5); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record1 key() { - return (Record1) super.key(); - } - - // ------------------------------------------------------------------------- - // Record6 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row6 fieldsRow() { - return (Row6) super.fieldsRow(); - } - - @Override - public Row6 valuesRow() { - return (Row6) super.valuesRow(); - } - - @Override - public Field field1() { - return DatasetVersion.DATASET_VERSION.DVID; - } - - @Override - public Field field2() { - return DatasetVersion.DATASET_VERSION.DID; - } - - @Override - public Field field3() { - return DatasetVersion.DATASET_VERSION.CREATOR_UID; - } - - @Override - public Field field4() { - return DatasetVersion.DATASET_VERSION.NAME; - } - - @Override - public Field field5() { - return DatasetVersion.DATASET_VERSION.VERSION_HASH; - } - - @Override - public Field field6() { - return DatasetVersion.DATASET_VERSION.CREATION_TIME; - } - - @Override - public UInteger component1() { - return getDvid(); - } - - @Override - public UInteger component2() { - return getDid(); - } - - @Override - public UInteger component3() { - return getCreatorUid(); - } - - @Override - public String component4() { - return getName(); - 
} - - @Override - public String component5() { - return getVersionHash(); - } - - @Override - public Timestamp component6() { - return getCreationTime(); - } - - @Override - public UInteger value1() { - return getDvid(); - } - - @Override - public UInteger value2() { - return getDid(); - } - - @Override - public UInteger value3() { - return getCreatorUid(); - } - - @Override - public String value4() { - return getName(); - } - - @Override - public String value5() { - return getVersionHash(); - } - - @Override - public Timestamp value6() { - return getCreationTime(); - } - - @Override - public DatasetVersionRecord value1(UInteger value) { - setDvid(value); - return this; - } - - @Override - public DatasetVersionRecord value2(UInteger value) { - setDid(value); - return this; - } - - @Override - public DatasetVersionRecord value3(UInteger value) { - setCreatorUid(value); - return this; - } - - @Override - public DatasetVersionRecord value4(String value) { - setName(value); - return this; - } - - @Override - public DatasetVersionRecord value5(String value) { - setVersionHash(value); - return this; - } - - @Override - public DatasetVersionRecord value6(Timestamp value) { - setCreationTime(value); - return this; - } - - @Override - public DatasetVersionRecord values(UInteger value1, UInteger value2, UInteger value3, String value4, String value5, Timestamp value6) { - value1(value1); - value2(value2); - value3(value3); - value4(value4); - value5(value5); - value6(value6); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IDatasetVersion from) { - setDvid(from.getDvid()); - setDid(from.getDid()); - setCreatorUid(from.getCreatorUid()); - setName(from.getName()); - setVersionHash(from.getVersionHash()); - setCreationTime(from.getCreationTime()); - } - - @Override - public E into(E into) { - 
into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached DatasetVersionRecord - */ - public DatasetVersionRecord() { - super(DatasetVersion.DATASET_VERSION); - } - - /** - * Create a detached, initialised DatasetVersionRecord - */ - public DatasetVersionRecord(UInteger dvid, UInteger did, UInteger creatorUid, String name, String versionHash, Timestamp creationTime) { - super(DatasetVersion.DATASET_VERSION); - - set(0, dvid); - set(1, did); - set(2, creatorUid); - set(3, name); - set(4, versionHash); - set(5, creationTime); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/ProjectRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/ProjectRecord.java deleted file mode 100644 index c483a9a1bbe..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/ProjectRecord.java +++ /dev/null @@ -1,327 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.Project; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IProject; -import org.jooq.Field; -import org.jooq.Record1; -import org.jooq.Record6; -import org.jooq.Row6; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class ProjectRecord extends UpdatableRecordImpl implements Record6, IProject { - - private static final long serialVersionUID = -787882699; - - /** - * Setter for texera_db.project.pid. 
- */ - @Override - public void setPid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.project.pid. - */ - @Override - public UInteger getPid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.project.name. - */ - @Override - public void setName(String value) { - set(1, value); - } - - /** - * Getter for texera_db.project.name. - */ - @Override - public String getName() { - return (String) get(1); - } - - /** - * Setter for texera_db.project.description. - */ - @Override - public void setDescription(String value) { - set(2, value); - } - - /** - * Getter for texera_db.project.description. - */ - @Override - public String getDescription() { - return (String) get(2); - } - - /** - * Setter for texera_db.project.owner_id. - */ - @Override - public void setOwnerId(UInteger value) { - set(3, value); - } - - /** - * Getter for texera_db.project.owner_id. - */ - @Override - public UInteger getOwnerId() { - return (UInteger) get(3); - } - - /** - * Setter for texera_db.project.creation_time. - */ - @Override - public void setCreationTime(Timestamp value) { - set(4, value); - } - - /** - * Getter for texera_db.project.creation_time. - */ - @Override - public Timestamp getCreationTime() { - return (Timestamp) get(4); - } - - /** - * Setter for texera_db.project.color. - */ - @Override - public void setColor(String value) { - set(5, value); - } - - /** - * Getter for texera_db.project.color. 
- */ - @Override - public String getColor() { - return (String) get(5); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record1 key() { - return (Record1) super.key(); - } - - // ------------------------------------------------------------------------- - // Record6 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row6 fieldsRow() { - return (Row6) super.fieldsRow(); - } - - @Override - public Row6 valuesRow() { - return (Row6) super.valuesRow(); - } - - @Override - public Field field1() { - return Project.PROJECT.PID; - } - - @Override - public Field field2() { - return Project.PROJECT.NAME; - } - - @Override - public Field field3() { - return Project.PROJECT.DESCRIPTION; - } - - @Override - public Field field4() { - return Project.PROJECT.OWNER_ID; - } - - @Override - public Field field5() { - return Project.PROJECT.CREATION_TIME; - } - - @Override - public Field field6() { - return Project.PROJECT.COLOR; - } - - @Override - public UInteger component1() { - return getPid(); - } - - @Override - public String component2() { - return getName(); - } - - @Override - public String component3() { - return getDescription(); - } - - @Override - public UInteger component4() { - return getOwnerId(); - } - - @Override - public Timestamp component5() { - return getCreationTime(); - } - - @Override - public String component6() { - return getColor(); - } - - @Override - public UInteger value1() { - return getPid(); - } - - @Override - public String value2() { - return getName(); - } - - @Override - public String value3() { - return getDescription(); - } - - @Override - public UInteger value4() { - return getOwnerId(); - } - - @Override - public Timestamp value5() { - return getCreationTime(); - } - - @Override - public String value6() { - 
return getColor(); - } - - @Override - public ProjectRecord value1(UInteger value) { - setPid(value); - return this; - } - - @Override - public ProjectRecord value2(String value) { - setName(value); - return this; - } - - @Override - public ProjectRecord value3(String value) { - setDescription(value); - return this; - } - - @Override - public ProjectRecord value4(UInteger value) { - setOwnerId(value); - return this; - } - - @Override - public ProjectRecord value5(Timestamp value) { - setCreationTime(value); - return this; - } - - @Override - public ProjectRecord value6(String value) { - setColor(value); - return this; - } - - @Override - public ProjectRecord values(UInteger value1, String value2, String value3, UInteger value4, Timestamp value5, String value6) { - value1(value1); - value2(value2); - value3(value3); - value4(value4); - value5(value5); - value6(value6); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IProject from) { - setPid(from.getPid()); - setName(from.getName()); - setDescription(from.getDescription()); - setOwnerId(from.getOwnerId()); - setCreationTime(from.getCreationTime()); - setColor(from.getColor()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached ProjectRecord - */ - public ProjectRecord() { - super(Project.PROJECT); - } - - /** - * Create a detached, initialised ProjectRecord - */ - public ProjectRecord(UInteger pid, String name, String description, UInteger ownerId, Timestamp creationTime, String color) { - super(Project.PROJECT); - - set(0, pid); - set(1, name); - set(2, description); - set(3, ownerId); - set(4, 
creationTime); - set(5, color); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/ProjectUserAccessRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/ProjectUserAccessRecord.java deleted file mode 100644 index 8e9311c293c..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/ProjectUserAccessRecord.java +++ /dev/null @@ -1,206 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.ProjectUserAccessPrivilege; -import edu.uci.ics.texera.web.model.jooq.generated.tables.ProjectUserAccess; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IProjectUserAccess; -import org.jooq.Field; -import org.jooq.Record2; -import org.jooq.Record3; -import org.jooq.Row3; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class ProjectUserAccessRecord extends UpdatableRecordImpl implements Record3, IProjectUserAccess { - - private static final long serialVersionUID = -1549467390; - - /** - * Setter for texera_db.project_user_access.uid. - */ - @Override - public void setUid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.project_user_access.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.project_user_access.pid. - */ - @Override - public void setPid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.project_user_access.pid. - */ - @Override - public UInteger getPid() { - return (UInteger) get(1); - } - - /** - * Setter for texera_db.project_user_access.privilege. 
- */ - @Override - public void setPrivilege(ProjectUserAccessPrivilege value) { - set(2, value); - } - - /** - * Getter for texera_db.project_user_access.privilege. - */ - @Override - public ProjectUserAccessPrivilege getPrivilege() { - return (ProjectUserAccessPrivilege) get(2); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record2 key() { - return (Record2) super.key(); - } - - // ------------------------------------------------------------------------- - // Record3 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row3 fieldsRow() { - return (Row3) super.fieldsRow(); - } - - @Override - public Row3 valuesRow() { - return (Row3) super.valuesRow(); - } - - @Override - public Field field1() { - return ProjectUserAccess.PROJECT_USER_ACCESS.UID; - } - - @Override - public Field field2() { - return ProjectUserAccess.PROJECT_USER_ACCESS.PID; - } - - @Override - public Field field3() { - return ProjectUserAccess.PROJECT_USER_ACCESS.PRIVILEGE; - } - - @Override - public UInteger component1() { - return getUid(); - } - - @Override - public UInteger component2() { - return getPid(); - } - - @Override - public ProjectUserAccessPrivilege component3() { - return getPrivilege(); - } - - @Override - public UInteger value1() { - return getUid(); - } - - @Override - public UInteger value2() { - return getPid(); - } - - @Override - public ProjectUserAccessPrivilege value3() { - return getPrivilege(); - } - - @Override - public ProjectUserAccessRecord value1(UInteger value) { - setUid(value); - return this; - } - - @Override - public ProjectUserAccessRecord value2(UInteger value) { - setPid(value); - return this; - } - - @Override - public ProjectUserAccessRecord value3(ProjectUserAccessPrivilege value) { - setPrivilege(value); - return this; - 
} - - @Override - public ProjectUserAccessRecord values(UInteger value1, UInteger value2, ProjectUserAccessPrivilege value3) { - value1(value1); - value2(value2); - value3(value3); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IProjectUserAccess from) { - setUid(from.getUid()); - setPid(from.getPid()); - setPrivilege(from.getPrivilege()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached ProjectUserAccessRecord - */ - public ProjectUserAccessRecord() { - super(ProjectUserAccess.PROJECT_USER_ACCESS); - } - - /** - * Create a detached, initialised ProjectUserAccessRecord - */ - public ProjectUserAccessRecord(UInteger uid, UInteger pid, ProjectUserAccessPrivilege privilege) { - super(ProjectUserAccess.PROJECT_USER_ACCESS); - - set(0, uid); - set(1, pid); - set(2, privilege); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/PublicProjectRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/PublicProjectRecord.java deleted file mode 100644 index b017081d193..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/PublicProjectRecord.java +++ /dev/null @@ -1,165 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.PublicProject; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IPublicProject; -import org.jooq.Field; -import org.jooq.Record1; -import org.jooq.Record2; -import org.jooq.Row2; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class PublicProjectRecord extends UpdatableRecordImpl implements Record2, IPublicProject { - - private static final long serialVersionUID = 89759719; - - /** - * Setter for texera_db.public_project.pid. - */ - @Override - public void setPid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.public_project.pid. - */ - @Override - public UInteger getPid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.public_project.uid. - */ - @Override - public void setUid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.public_project.uid. 
- */ - @Override - public UInteger getUid() { - return (UInteger) get(1); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record1 key() { - return (Record1) super.key(); - } - - // ------------------------------------------------------------------------- - // Record2 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row2 fieldsRow() { - return (Row2) super.fieldsRow(); - } - - @Override - public Row2 valuesRow() { - return (Row2) super.valuesRow(); - } - - @Override - public Field field1() { - return PublicProject.PUBLIC_PROJECT.PID; - } - - @Override - public Field field2() { - return PublicProject.PUBLIC_PROJECT.UID; - } - - @Override - public UInteger component1() { - return getPid(); - } - - @Override - public UInteger component2() { - return getUid(); - } - - @Override - public UInteger value1() { - return getPid(); - } - - @Override - public UInteger value2() { - return getUid(); - } - - @Override - public PublicProjectRecord value1(UInteger value) { - setPid(value); - return this; - } - - @Override - public PublicProjectRecord value2(UInteger value) { - setUid(value); - return this; - } - - @Override - public PublicProjectRecord values(UInteger value1, UInteger value2) { - value1(value1); - value2(value2); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IPublicProject from) { - setPid(from.getPid()); - setUid(from.getUid()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // 
------------------------------------------------------------------------- - - /** - * Create a detached PublicProjectRecord - */ - public PublicProjectRecord() { - super(PublicProject.PUBLIC_PROJECT); - } - - /** - * Create a detached, initialised PublicProjectRecord - */ - public PublicProjectRecord(UInteger pid, UInteger uid) { - super(PublicProject.PUBLIC_PROJECT); - - set(0, pid); - set(1, uid); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/UserConfigRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/UserConfigRecord.java deleted file mode 100644 index 6f6e7637e33..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/UserConfigRecord.java +++ /dev/null @@ -1,205 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.UserConfig; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IUserConfig; -import org.jooq.Field; -import org.jooq.Record2; -import org.jooq.Record3; -import org.jooq.Row3; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class UserConfigRecord extends UpdatableRecordImpl implements Record3, IUserConfig { - - private static final long serialVersionUID = 1371414609; - - /** - * Setter for texera_db.user_config.uid. - */ - @Override - public void setUid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.user_config.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.user_config.key. - */ - @Override - public void setKey(String value) { - set(1, value); - } - - /** - * Getter for texera_db.user_config.key. 
- */ - @Override - public String getKey() { - return (String) get(1); - } - - /** - * Setter for texera_db.user_config.value. - */ - @Override - public void setValue(String value) { - set(2, value); - } - - /** - * Getter for texera_db.user_config.value. - */ - @Override - public String getValue() { - return (String) get(2); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record2 key() { - return (Record2) super.key(); - } - - // ------------------------------------------------------------------------- - // Record3 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row3 fieldsRow() { - return (Row3) super.fieldsRow(); - } - - @Override - public Row3 valuesRow() { - return (Row3) super.valuesRow(); - } - - @Override - public Field field1() { - return UserConfig.USER_CONFIG.UID; - } - - @Override - public Field field2() { - return UserConfig.USER_CONFIG.KEY; - } - - @Override - public Field field3() { - return UserConfig.USER_CONFIG.VALUE; - } - - @Override - public UInteger component1() { - return getUid(); - } - - @Override - public String component2() { - return getKey(); - } - - @Override - public String component3() { - return getValue(); - } - - @Override - public UInteger value1() { - return getUid(); - } - - @Override - public String value2() { - return getKey(); - } - - @Override - public String value3() { - return getValue(); - } - - @Override - public UserConfigRecord value1(UInteger value) { - setUid(value); - return this; - } - - @Override - public UserConfigRecord value2(String value) { - setKey(value); - return this; - } - - @Override - public UserConfigRecord value3(String value) { - setValue(value); - return this; - } - - @Override - public UserConfigRecord values(UInteger value1, String value2, String value3) { - 
value1(value1); - value2(value2); - value3(value3); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IUserConfig from) { - setUid(from.getUid()); - setKey(from.getKey()); - setValue(from.getValue()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached UserConfigRecord - */ - public UserConfigRecord() { - super(UserConfig.USER_CONFIG); - } - - /** - * Create a detached, initialised UserConfigRecord - */ - public UserConfigRecord(UInteger uid, String key, String value) { - super(UserConfig.USER_CONFIG); - - set(0, uid); - set(1, key); - set(2, value); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/UserRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/UserRecord.java deleted file mode 100644 index a361a50291f..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/UserRecord.java +++ /dev/null @@ -1,366 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole; -import edu.uci.ics.texera.web.model.jooq.generated.tables.User; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IUser; -import org.jooq.Field; -import org.jooq.Record1; -import org.jooq.Record7; -import org.jooq.Row7; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. 
- */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class UserRecord extends UpdatableRecordImpl implements Record7, IUser { - - private static final long serialVersionUID = -360916281; - - /** - * Setter for texera_db.user.uid. - */ - @Override - public void setUid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.user.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.user.name. - */ - @Override - public void setName(String value) { - set(1, value); - } - - /** - * Getter for texera_db.user.name. - */ - @Override - public String getName() { - return (String) get(1); - } - - /** - * Setter for texera_db.user.email. - */ - @Override - public void setEmail(String value) { - set(2, value); - } - - /** - * Getter for texera_db.user.email. - */ - @Override - public String getEmail() { - return (String) get(2); - } - - /** - * Setter for texera_db.user.password. - */ - @Override - public void setPassword(String value) { - set(3, value); - } - - /** - * Getter for texera_db.user.password. - */ - @Override - public String getPassword() { - return (String) get(3); - } - - /** - * Setter for texera_db.user.google_id. - */ - @Override - public void setGoogleId(String value) { - set(4, value); - } - - /** - * Getter for texera_db.user.google_id. - */ - @Override - public String getGoogleId() { - return (String) get(4); - } - - /** - * Setter for texera_db.user.role. - */ - @Override - public void setRole(UserRole value) { - set(5, value); - } - - /** - * Getter for texera_db.user.role. - */ - @Override - public UserRole getRole() { - return (UserRole) get(5); - } - - /** - * Setter for texera_db.user.google_avatar. - */ - @Override - public void setGoogleAvatar(String value) { - set(6, value); - } - - /** - * Getter for texera_db.user.google_avatar. 
- */ - @Override - public String getGoogleAvatar() { - return (String) get(6); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record1 key() { - return (Record1) super.key(); - } - - // ------------------------------------------------------------------------- - // Record7 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row7 fieldsRow() { - return (Row7) super.fieldsRow(); - } - - @Override - public Row7 valuesRow() { - return (Row7) super.valuesRow(); - } - - @Override - public Field field1() { - return User.USER.UID; - } - - @Override - public Field field2() { - return User.USER.NAME; - } - - @Override - public Field field3() { - return User.USER.EMAIL; - } - - @Override - public Field field4() { - return User.USER.PASSWORD; - } - - @Override - public Field field5() { - return User.USER.GOOGLE_ID; - } - - @Override - public Field field6() { - return User.USER.ROLE; - } - - @Override - public Field field7() { - return User.USER.GOOGLE_AVATAR; - } - - @Override - public UInteger component1() { - return getUid(); - } - - @Override - public String component2() { - return getName(); - } - - @Override - public String component3() { - return getEmail(); - } - - @Override - public String component4() { - return getPassword(); - } - - @Override - public String component5() { - return getGoogleId(); - } - - @Override - public UserRole component6() { - return getRole(); - } - - @Override - public String component7() { - return getGoogleAvatar(); - } - - @Override - public UInteger value1() { - return getUid(); - } - - @Override - public String value2() { - return getName(); - } - - @Override - public String value3() { - return getEmail(); - } - - @Override - public String value4() { - return getPassword(); - } - - @Override - public 
String value5() { - return getGoogleId(); - } - - @Override - public UserRole value6() { - return getRole(); - } - - @Override - public String value7() { - return getGoogleAvatar(); - } - - @Override - public UserRecord value1(UInteger value) { - setUid(value); - return this; - } - - @Override - public UserRecord value2(String value) { - setName(value); - return this; - } - - @Override - public UserRecord value3(String value) { - setEmail(value); - return this; - } - - @Override - public UserRecord value4(String value) { - setPassword(value); - return this; - } - - @Override - public UserRecord value5(String value) { - setGoogleId(value); - return this; - } - - @Override - public UserRecord value6(UserRole value) { - setRole(value); - return this; - } - - @Override - public UserRecord value7(String value) { - setGoogleAvatar(value); - return this; - } - - @Override - public UserRecord values(UInteger value1, String value2, String value3, String value4, String value5, UserRole value6, String value7) { - value1(value1); - value2(value2); - value3(value3); - value4(value4); - value5(value5); - value6(value6); - value7(value7); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IUser from) { - setUid(from.getUid()); - setName(from.getName()); - setEmail(from.getEmail()); - setPassword(from.getPassword()); - setGoogleId(from.getGoogleId()); - setRole(from.getRole()); - setGoogleAvatar(from.getGoogleAvatar()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached UserRecord - */ - public UserRecord() { - super(User.USER); - } - - /** - * Create a detached, 
initialised UserRecord - */ - public UserRecord(UInteger uid, String name, String email, String password, String googleId, UserRole role, String googleAvatar) { - super(User.USER); - - set(0, uid); - set(1, name); - set(2, email); - set(3, password); - set(4, googleId); - set(5, role); - set(6, googleAvatar); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowExecutionsRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowExecutionsRecord.java deleted file mode 100644 index db1251cbce1..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowExecutionsRecord.java +++ /dev/null @@ -1,527 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowExecutions; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowExecutions; -import org.jooq.Field; -import org.jooq.Record1; -import org.jooq.Record11; -import org.jooq.Row11; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowExecutionsRecord extends UpdatableRecordImpl implements Record11, IWorkflowExecutions { - - private static final long serialVersionUID = 1943572236; - - /** - * Setter for texera_db.workflow_executions.eid. - */ - @Override - public void setEid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_executions.eid. - */ - @Override - public UInteger getEid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_executions.vid. 
- */ - @Override - public void setVid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_executions.vid. - */ - @Override - public UInteger getVid() { - return (UInteger) get(1); - } - - /** - * Setter for texera_db.workflow_executions.uid. - */ - @Override - public void setUid(UInteger value) { - set(2, value); - } - - /** - * Getter for texera_db.workflow_executions.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(2); - } - - /** - * Setter for texera_db.workflow_executions.status. - */ - @Override - public void setStatus(Byte value) { - set(3, value); - } - - /** - * Getter for texera_db.workflow_executions.status. - */ - @Override - public Byte getStatus() { - return (Byte) get(3); - } - - /** - * Setter for texera_db.workflow_executions.result. - */ - @Override - public void setResult(String value) { - set(4, value); - } - - /** - * Getter for texera_db.workflow_executions.result. - */ - @Override - public String getResult() { - return (String) get(4); - } - - /** - * Setter for texera_db.workflow_executions.starting_time. - */ - @Override - public void setStartingTime(Timestamp value) { - set(5, value); - } - - /** - * Getter for texera_db.workflow_executions.starting_time. - */ - @Override - public Timestamp getStartingTime() { - return (Timestamp) get(5); - } - - /** - * Setter for texera_db.workflow_executions.last_update_time. - */ - @Override - public void setLastUpdateTime(Timestamp value) { - set(6, value); - } - - /** - * Getter for texera_db.workflow_executions.last_update_time. - */ - @Override - public Timestamp getLastUpdateTime() { - return (Timestamp) get(6); - } - - /** - * Setter for texera_db.workflow_executions.bookmarked. - */ - @Override - public void setBookmarked(Byte value) { - set(7, value); - } - - /** - * Getter for texera_db.workflow_executions.bookmarked. 
- */ - @Override - public Byte getBookmarked() { - return (Byte) get(7); - } - - /** - * Setter for texera_db.workflow_executions.name. - */ - @Override - public void setName(String value) { - set(8, value); - } - - /** - * Getter for texera_db.workflow_executions.name. - */ - @Override - public String getName() { - return (String) get(8); - } - - /** - * Setter for texera_db.workflow_executions.environment_version. - */ - @Override - public void setEnvironmentVersion(String value) { - set(9, value); - } - - /** - * Getter for texera_db.workflow_executions.environment_version. - */ - @Override - public String getEnvironmentVersion() { - return (String) get(9); - } - - /** - * Setter for texera_db.workflow_executions.log_location. - */ - @Override - public void setLogLocation(String value) { - set(10, value); - } - - /** - * Getter for texera_db.workflow_executions.log_location. - */ - @Override - public String getLogLocation() { - return (String) get(10); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record1 key() { - return (Record1) super.key(); - } - - // ------------------------------------------------------------------------- - // Record11 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row11 fieldsRow() { - return (Row11) super.fieldsRow(); - } - - @Override - public Row11 valuesRow() { - return (Row11) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.EID; - } - - @Override - public Field field2() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.VID; - } - - @Override - public Field field3() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.UID; - } - - @Override - public Field field4() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.STATUS; - } - - 
@Override - public Field field5() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.RESULT; - } - - @Override - public Field field6() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.STARTING_TIME; - } - - @Override - public Field field7() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.LAST_UPDATE_TIME; - } - - @Override - public Field field8() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.BOOKMARKED; - } - - @Override - public Field field9() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.NAME; - } - - @Override - public Field field10() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.ENVIRONMENT_VERSION; - } - - @Override - public Field field11() { - return WorkflowExecutions.WORKFLOW_EXECUTIONS.LOG_LOCATION; - } - - @Override - public UInteger component1() { - return getEid(); - } - - @Override - public UInteger component2() { - return getVid(); - } - - @Override - public UInteger component3() { - return getUid(); - } - - @Override - public Byte component4() { - return getStatus(); - } - - @Override - public String component5() { - return getResult(); - } - - @Override - public Timestamp component6() { - return getStartingTime(); - } - - @Override - public Timestamp component7() { - return getLastUpdateTime(); - } - - @Override - public Byte component8() { - return getBookmarked(); - } - - @Override - public String component9() { - return getName(); - } - - @Override - public String component10() { - return getEnvironmentVersion(); - } - - @Override - public String component11() { - return getLogLocation(); - } - - @Override - public UInteger value1() { - return getEid(); - } - - @Override - public UInteger value2() { - return getVid(); - } - - @Override - public UInteger value3() { - return getUid(); - } - - @Override - public Byte value4() { - return getStatus(); - } - - @Override - public String value5() { - return getResult(); - } - - @Override - public Timestamp value6() { - return getStartingTime(); - } - - @Override - public Timestamp value7() 
{ - return getLastUpdateTime(); - } - - @Override - public Byte value8() { - return getBookmarked(); - } - - @Override - public String value9() { - return getName(); - } - - @Override - public String value10() { - return getEnvironmentVersion(); - } - - @Override - public String value11() { - return getLogLocation(); - } - - @Override - public WorkflowExecutionsRecord value1(UInteger value) { - setEid(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value2(UInteger value) { - setVid(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value3(UInteger value) { - setUid(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value4(Byte value) { - setStatus(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value5(String value) { - setResult(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value6(Timestamp value) { - setStartingTime(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value7(Timestamp value) { - setLastUpdateTime(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value8(Byte value) { - setBookmarked(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value9(String value) { - setName(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value10(String value) { - setEnvironmentVersion(value); - return this; - } - - @Override - public WorkflowExecutionsRecord value11(String value) { - setLogLocation(value); - return this; - } - - @Override - public WorkflowExecutionsRecord values(UInteger value1, UInteger value2, UInteger value3, Byte value4, String value5, Timestamp value6, Timestamp value7, Byte value8, String value9, String value10, String value11) { - value1(value1); - value2(value2); - value3(value3); - value4(value4); - value5(value5); - value6(value6); - value7(value7); - value8(value8); - value9(value9); - value10(value10); - 
value11(value11); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowExecutions from) { - setEid(from.getEid()); - setVid(from.getVid()); - setUid(from.getUid()); - setStatus(from.getStatus()); - setResult(from.getResult()); - setStartingTime(from.getStartingTime()); - setLastUpdateTime(from.getLastUpdateTime()); - setBookmarked(from.getBookmarked()); - setName(from.getName()); - setEnvironmentVersion(from.getEnvironmentVersion()); - setLogLocation(from.getLogLocation()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowExecutionsRecord - */ - public WorkflowExecutionsRecord() { - super(WorkflowExecutions.WORKFLOW_EXECUTIONS); - } - - /** - * Create a detached, initialised WorkflowExecutionsRecord - */ - public WorkflowExecutionsRecord(UInteger eid, UInteger vid, UInteger uid, Byte status, String result, Timestamp startingTime, Timestamp lastUpdateTime, Byte bookmarked, String name, String environmentVersion, String logLocation) { - super(WorkflowExecutions.WORKFLOW_EXECUTIONS); - - set(0, eid); - set(1, vid); - set(2, uid); - set(3, status); - set(4, result); - set(5, startingTime); - set(6, lastUpdateTime); - set(7, bookmarked); - set(8, name); - set(9, environmentVersion); - set(10, logLocation); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowOfProjectRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowOfProjectRecord.java deleted file mode 100644 index b3deecb03a1..00000000000 --- 
a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowOfProjectRecord.java +++ /dev/null @@ -1,164 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowOfProject; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowOfProject; -import org.jooq.Field; -import org.jooq.Record2; -import org.jooq.Row2; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowOfProjectRecord extends UpdatableRecordImpl implements Record2, IWorkflowOfProject { - - private static final long serialVersionUID = 919631497; - - /** - * Setter for texera_db.workflow_of_project.wid. - */ - @Override - public void setWid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_of_project.wid. - */ - @Override - public UInteger getWid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_of_project.pid. - */ - @Override - public void setPid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_of_project.pid. 
- */ - @Override - public UInteger getPid() { - return (UInteger) get(1); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record2 key() { - return (Record2) super.key(); - } - - // ------------------------------------------------------------------------- - // Record2 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row2 fieldsRow() { - return (Row2) super.fieldsRow(); - } - - @Override - public Row2 valuesRow() { - return (Row2) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowOfProject.WORKFLOW_OF_PROJECT.WID; - } - - @Override - public Field field2() { - return WorkflowOfProject.WORKFLOW_OF_PROJECT.PID; - } - - @Override - public UInteger component1() { - return getWid(); - } - - @Override - public UInteger component2() { - return getPid(); - } - - @Override - public UInteger value1() { - return getWid(); - } - - @Override - public UInteger value2() { - return getPid(); - } - - @Override - public WorkflowOfProjectRecord value1(UInteger value) { - setWid(value); - return this; - } - - @Override - public WorkflowOfProjectRecord value2(UInteger value) { - setPid(value); - return this; - } - - @Override - public WorkflowOfProjectRecord values(UInteger value1, UInteger value2) { - value1(value1); - value2(value2); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowOfProject from) { - setWid(from.getWid()); - setPid(from.getPid()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // 
------------------------------------------------------------------------- - - /** - * Create a detached WorkflowOfProjectRecord - */ - public WorkflowOfProjectRecord() { - super(WorkflowOfProject.WORKFLOW_OF_PROJECT); - } - - /** - * Create a detached, initialised WorkflowOfProjectRecord - */ - public WorkflowOfProjectRecord(UInteger wid, UInteger pid) { - super(WorkflowOfProject.WORKFLOW_OF_PROJECT); - - set(0, wid); - set(1, pid); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowOfUserRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowOfUserRecord.java deleted file mode 100644 index 832a05fa073..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowOfUserRecord.java +++ /dev/null @@ -1,164 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowOfUser; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowOfUser; -import org.jooq.Field; -import org.jooq.Record2; -import org.jooq.Row2; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowOfUserRecord extends UpdatableRecordImpl implements Record2, IWorkflowOfUser { - - private static final long serialVersionUID = -189114339; - - /** - * Setter for texera_db.workflow_of_user.uid. - */ - @Override - public void setUid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_of_user.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_of_user.wid. 
- */ - @Override - public void setWid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_of_user.wid. - */ - @Override - public UInteger getWid() { - return (UInteger) get(1); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record2 key() { - return (Record2) super.key(); - } - - // ------------------------------------------------------------------------- - // Record2 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row2 fieldsRow() { - return (Row2) super.fieldsRow(); - } - - @Override - public Row2 valuesRow() { - return (Row2) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowOfUser.WORKFLOW_OF_USER.UID; - } - - @Override - public Field field2() { - return WorkflowOfUser.WORKFLOW_OF_USER.WID; - } - - @Override - public UInteger component1() { - return getUid(); - } - - @Override - public UInteger component2() { - return getWid(); - } - - @Override - public UInteger value1() { - return getUid(); - } - - @Override - public UInteger value2() { - return getWid(); - } - - @Override - public WorkflowOfUserRecord value1(UInteger value) { - setUid(value); - return this; - } - - @Override - public WorkflowOfUserRecord value2(UInteger value) { - setWid(value); - return this; - } - - @Override - public WorkflowOfUserRecord values(UInteger value1, UInteger value2) { - value1(value1); - value2(value2); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowOfUser from) { - setUid(from.getUid()); - setWid(from.getWid()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - 
} - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowOfUserRecord - */ - public WorkflowOfUserRecord() { - super(WorkflowOfUser.WORKFLOW_OF_USER); - } - - /** - * Create a detached, initialised WorkflowOfUserRecord - */ - public WorkflowOfUserRecord(UInteger uid, UInteger wid) { - super(WorkflowOfUser.WORKFLOW_OF_USER); - - set(0, uid); - set(1, wid); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowRecord.java deleted file mode 100644 index d46ecd2af29..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowRecord.java +++ /dev/null @@ -1,367 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.Workflow; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflow; -import org.jooq.Field; -import org.jooq.Record1; -import org.jooq.Record7; -import org.jooq.Row7; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowRecord extends UpdatableRecordImpl implements Record7, IWorkflow { - - private static final long serialVersionUID = 702670612; - - /** - * Setter for texera_db.workflow.name. - */ - @Override - public void setName(String value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow.name. - */ - @Override - public String getName() { - return (String) get(0); - } - - /** - * Setter for texera_db.workflow.description. 
- */ - @Override - public void setDescription(String value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow.description. - */ - @Override - public String getDescription() { - return (String) get(1); - } - - /** - * Setter for texera_db.workflow.wid. - */ - @Override - public void setWid(UInteger value) { - set(2, value); - } - - /** - * Getter for texera_db.workflow.wid. - */ - @Override - public UInteger getWid() { - return (UInteger) get(2); - } - - /** - * Setter for texera_db.workflow.content. - */ - @Override - public void setContent(String value) { - set(3, value); - } - - /** - * Getter for texera_db.workflow.content. - */ - @Override - public String getContent() { - return (String) get(3); - } - - /** - * Setter for texera_db.workflow.creation_time. - */ - @Override - public void setCreationTime(Timestamp value) { - set(4, value); - } - - /** - * Getter for texera_db.workflow.creation_time. - */ - @Override - public Timestamp getCreationTime() { - return (Timestamp) get(4); - } - - /** - * Setter for texera_db.workflow.last_modified_time. - */ - @Override - public void setLastModifiedTime(Timestamp value) { - set(5, value); - } - - /** - * Getter for texera_db.workflow.last_modified_time. - */ - @Override - public Timestamp getLastModifiedTime() { - return (Timestamp) get(5); - } - - /** - * Setter for texera_db.workflow.is_published. - */ - @Override - public void setIsPublished(Byte value) { - set(6, value); - } - - /** - * Getter for texera_db.workflow.is_published. 
- */ - @Override - public Byte getIsPublished() { - return (Byte) get(6); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record1 key() { - return (Record1) super.key(); - } - - // ------------------------------------------------------------------------- - // Record7 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row7 fieldsRow() { - return (Row7) super.fieldsRow(); - } - - @Override - public Row7 valuesRow() { - return (Row7) super.valuesRow(); - } - - @Override - public Field field1() { - return Workflow.WORKFLOW.NAME; - } - - @Override - public Field field2() { - return Workflow.WORKFLOW.DESCRIPTION; - } - - @Override - public Field field3() { - return Workflow.WORKFLOW.WID; - } - - @Override - public Field field4() { - return Workflow.WORKFLOW.CONTENT; - } - - @Override - public Field field5() { - return Workflow.WORKFLOW.CREATION_TIME; - } - - @Override - public Field field6() { - return Workflow.WORKFLOW.LAST_MODIFIED_TIME; - } - - @Override - public Field field7() { - return Workflow.WORKFLOW.IS_PUBLISHED; - } - - @Override - public String component1() { - return getName(); - } - - @Override - public String component2() { - return getDescription(); - } - - @Override - public UInteger component3() { - return getWid(); - } - - @Override - public String component4() { - return getContent(); - } - - @Override - public Timestamp component5() { - return getCreationTime(); - } - - @Override - public Timestamp component6() { - return getLastModifiedTime(); - } - - @Override - public Byte component7() { - return getIsPublished(); - } - - @Override - public String value1() { - return getName(); - } - - @Override - public String value2() { - return getDescription(); - } - - @Override - public UInteger value3() { - return getWid(); 
- } - - @Override - public String value4() { - return getContent(); - } - - @Override - public Timestamp value5() { - return getCreationTime(); - } - - @Override - public Timestamp value6() { - return getLastModifiedTime(); - } - - @Override - public Byte value7() { - return getIsPublished(); - } - - @Override - public WorkflowRecord value1(String value) { - setName(value); - return this; - } - - @Override - public WorkflowRecord value2(String value) { - setDescription(value); - return this; - } - - @Override - public WorkflowRecord value3(UInteger value) { - setWid(value); - return this; - } - - @Override - public WorkflowRecord value4(String value) { - setContent(value); - return this; - } - - @Override - public WorkflowRecord value5(Timestamp value) { - setCreationTime(value); - return this; - } - - @Override - public WorkflowRecord value6(Timestamp value) { - setLastModifiedTime(value); - return this; - } - - @Override - public WorkflowRecord value7(Byte value) { - setIsPublished(value); - return this; - } - - @Override - public WorkflowRecord values(String value1, String value2, UInteger value3, String value4, Timestamp value5, Timestamp value6, Byte value7) { - value1(value1); - value2(value2); - value3(value3); - value4(value4); - value5(value5); - value6(value6); - value7(value7); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflow from) { - setName(from.getName()); - setDescription(from.getDescription()); - setWid(from.getWid()); - setContent(from.getContent()); - setCreationTime(from.getCreationTime()); - setLastModifiedTime(from.getLastModifiedTime()); - setIsPublished(from.getIsPublished()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // 
Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowRecord - */ - public WorkflowRecord() { - super(Workflow.WORKFLOW); - } - - /** - * Create a detached, initialised WorkflowRecord - */ - public WorkflowRecord(String name, String description, UInteger wid, String content, Timestamp creationTime, Timestamp lastModifiedTime, Byte isPublished) { - super(Workflow.WORKFLOW); - - set(0, name); - set(1, description); - set(2, wid); - set(3, content); - set(4, creationTime); - set(5, lastModifiedTime); - set(6, isPublished); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowRuntimeStatisticsRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowRuntimeStatisticsRecord.java deleted file mode 100644 index cbd5a9066d7..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowRuntimeStatisticsRecord.java +++ /dev/null @@ -1,528 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowRuntimeStatistics; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowRuntimeStatistics; -import org.jooq.Field; -import org.jooq.Record11; -import org.jooq.Record4; -import org.jooq.Row11; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; -import org.jooq.types.ULong; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowRuntimeStatisticsRecord extends UpdatableRecordImpl implements Record11, IWorkflowRuntimeStatistics { - - private static final long serialVersionUID = 367945669; - - /** - * Setter for texera_db.workflow_runtime_statistics.workflow_id. 
- */ - @Override - public void setWorkflowId(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.workflow_id. - */ - @Override - public UInteger getWorkflowId() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.execution_id. - */ - @Override - public void setExecutionId(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.execution_id. - */ - @Override - public UInteger getExecutionId() { - return (UInteger) get(1); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.operator_id. - */ - @Override - public void setOperatorId(String value) { - set(2, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.operator_id. - */ - @Override - public String getOperatorId() { - return (String) get(2); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.time. - */ - @Override - public void setTime(Timestamp value) { - set(3, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.time. - */ - @Override - public Timestamp getTime() { - return (Timestamp) get(3); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.input_tuple_cnt. - */ - @Override - public void setInputTupleCnt(UInteger value) { - set(4, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.input_tuple_cnt. - */ - @Override - public UInteger getInputTupleCnt() { - return (UInteger) get(4); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.output_tuple_cnt. - */ - @Override - public void setOutputTupleCnt(UInteger value) { - set(5, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.output_tuple_cnt. - */ - @Override - public UInteger getOutputTupleCnt() { - return (UInteger) get(5); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.status. 
- */ - @Override - public void setStatus(Byte value) { - set(6, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.status. - */ - @Override - public Byte getStatus() { - return (Byte) get(6); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.data_processing_time. - */ - @Override - public void setDataProcessingTime(ULong value) { - set(7, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.data_processing_time. - */ - @Override - public ULong getDataProcessingTime() { - return (ULong) get(7); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.control_processing_time. - */ - @Override - public void setControlProcessingTime(ULong value) { - set(8, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.control_processing_time. - */ - @Override - public ULong getControlProcessingTime() { - return (ULong) get(8); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.idle_time. - */ - @Override - public void setIdleTime(ULong value) { - set(9, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.idle_time. - */ - @Override - public ULong getIdleTime() { - return (ULong) get(9); - } - - /** - * Setter for texera_db.workflow_runtime_statistics.num_workers. - */ - @Override - public void setNumWorkers(UInteger value) { - set(10, value); - } - - /** - * Getter for texera_db.workflow_runtime_statistics.num_workers. 
- */ - @Override - public UInteger getNumWorkers() { - return (UInteger) get(10); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record4 key() { - return (Record4) super.key(); - } - - // ------------------------------------------------------------------------- - // Record11 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row11 fieldsRow() { - return (Row11) super.fieldsRow(); - } - - @Override - public Row11 valuesRow() { - return (Row11) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.WORKFLOW_ID; - } - - @Override - public Field field2() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.EXECUTION_ID; - } - - @Override - public Field field3() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.OPERATOR_ID; - } - - @Override - public Field field4() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.TIME; - } - - @Override - public Field field5() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.INPUT_TUPLE_CNT; - } - - @Override - public Field field6() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.OUTPUT_TUPLE_CNT; - } - - @Override - public Field field7() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.STATUS; - } - - @Override - public Field field8() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.DATA_PROCESSING_TIME; - } - - @Override - public Field field9() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.CONTROL_PROCESSING_TIME; - } - - @Override - public Field field10() { - return WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.IDLE_TIME; - } - - @Override - public Field field11() { - return 
WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS.NUM_WORKERS; - } - - @Override - public UInteger component1() { - return getWorkflowId(); - } - - @Override - public UInteger component2() { - return getExecutionId(); - } - - @Override - public String component3() { - return getOperatorId(); - } - - @Override - public Timestamp component4() { - return getTime(); - } - - @Override - public UInteger component5() { - return getInputTupleCnt(); - } - - @Override - public UInteger component6() { - return getOutputTupleCnt(); - } - - @Override - public Byte component7() { - return getStatus(); - } - - @Override - public ULong component8() { - return getDataProcessingTime(); - } - - @Override - public ULong component9() { - return getControlProcessingTime(); - } - - @Override - public ULong component10() { - return getIdleTime(); - } - - @Override - public UInteger component11() { - return getNumWorkers(); - } - - @Override - public UInteger value1() { - return getWorkflowId(); - } - - @Override - public UInteger value2() { - return getExecutionId(); - } - - @Override - public String value3() { - return getOperatorId(); - } - - @Override - public Timestamp value4() { - return getTime(); - } - - @Override - public UInteger value5() { - return getInputTupleCnt(); - } - - @Override - public UInteger value6() { - return getOutputTupleCnt(); - } - - @Override - public Byte value7() { - return getStatus(); - } - - @Override - public ULong value8() { - return getDataProcessingTime(); - } - - @Override - public ULong value9() { - return getControlProcessingTime(); - } - - @Override - public ULong value10() { - return getIdleTime(); - } - - @Override - public UInteger value11() { - return getNumWorkers(); - } - - @Override - public WorkflowRuntimeStatisticsRecord value1(UInteger value) { - setWorkflowId(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord value2(UInteger value) { - setExecutionId(value); - return this; - } - - @Override - public 
WorkflowRuntimeStatisticsRecord value3(String value) { - setOperatorId(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord value4(Timestamp value) { - setTime(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord value5(UInteger value) { - setInputTupleCnt(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord value6(UInteger value) { - setOutputTupleCnt(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord value7(Byte value) { - setStatus(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord value8(ULong value) { - setDataProcessingTime(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord value9(ULong value) { - setControlProcessingTime(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord value10(ULong value) { - setIdleTime(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord value11(UInteger value) { - setNumWorkers(value); - return this; - } - - @Override - public WorkflowRuntimeStatisticsRecord values(UInteger value1, UInteger value2, String value3, Timestamp value4, UInteger value5, UInteger value6, Byte value7, ULong value8, ULong value9, ULong value10, UInteger value11) { - value1(value1); - value2(value2); - value3(value3); - value4(value4); - value5(value5); - value6(value6); - value7(value7); - value8(value8); - value9(value9); - value10(value10); - value11(value11); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowRuntimeStatistics from) { - setWorkflowId(from.getWorkflowId()); - setExecutionId(from.getExecutionId()); - setOperatorId(from.getOperatorId()); - setTime(from.getTime()); - setInputTupleCnt(from.getInputTupleCnt()); - 
setOutputTupleCnt(from.getOutputTupleCnt()); - setStatus(from.getStatus()); - setDataProcessingTime(from.getDataProcessingTime()); - setControlProcessingTime(from.getControlProcessingTime()); - setIdleTime(from.getIdleTime()); - setNumWorkers(from.getNumWorkers()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowRuntimeStatisticsRecord - */ - public WorkflowRuntimeStatisticsRecord() { - super(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS); - } - - /** - * Create a detached, initialised WorkflowRuntimeStatisticsRecord - */ - public WorkflowRuntimeStatisticsRecord(UInteger workflowId, UInteger executionId, String operatorId, Timestamp time, UInteger inputTupleCnt, UInteger outputTupleCnt, Byte status, ULong dataProcessingTime, ULong controlProcessingTime, ULong idleTime, UInteger numWorkers) { - super(WorkflowRuntimeStatistics.WORKFLOW_RUNTIME_STATISTICS); - - set(0, workflowId); - set(1, executionId); - set(2, operatorId); - set(3, time); - set(4, inputTupleCnt); - set(5, outputTupleCnt); - set(6, status); - set(7, dataProcessingTime); - set(8, controlProcessingTime); - set(9, idleTime); - set(10, numWorkers); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserAccessRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserAccessRecord.java deleted file mode 100644 index aee2234dd3c..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserAccessRecord.java +++ /dev/null @@ -1,206 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.enums.WorkflowUserAccessPrivilege; -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserAccess; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserAccess; -import org.jooq.Field; -import org.jooq.Record2; -import org.jooq.Record3; -import org.jooq.Row3; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserAccessRecord extends UpdatableRecordImpl implements Record3, IWorkflowUserAccess { - - private static final long serialVersionUID = 994708214; - - /** - * Setter for texera_db.workflow_user_access.uid. - */ - @Override - public void setUid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_user_access.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_user_access.wid. - */ - @Override - public void setWid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_user_access.wid. - */ - @Override - public UInteger getWid() { - return (UInteger) get(1); - } - - /** - * Setter for texera_db.workflow_user_access.privilege. - */ - @Override - public void setPrivilege(WorkflowUserAccessPrivilege value) { - set(2, value); - } - - /** - * Getter for texera_db.workflow_user_access.privilege. 
- */ - @Override - public WorkflowUserAccessPrivilege getPrivilege() { - return (WorkflowUserAccessPrivilege) get(2); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record2 key() { - return (Record2) super.key(); - } - - // ------------------------------------------------------------------------- - // Record3 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row3 fieldsRow() { - return (Row3) super.fieldsRow(); - } - - @Override - public Row3 valuesRow() { - return (Row3) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowUserAccess.WORKFLOW_USER_ACCESS.UID; - } - - @Override - public Field field2() { - return WorkflowUserAccess.WORKFLOW_USER_ACCESS.WID; - } - - @Override - public Field field3() { - return WorkflowUserAccess.WORKFLOW_USER_ACCESS.PRIVILEGE; - } - - @Override - public UInteger component1() { - return getUid(); - } - - @Override - public UInteger component2() { - return getWid(); - } - - @Override - public WorkflowUserAccessPrivilege component3() { - return getPrivilege(); - } - - @Override - public UInteger value1() { - return getUid(); - } - - @Override - public UInteger value2() { - return getWid(); - } - - @Override - public WorkflowUserAccessPrivilege value3() { - return getPrivilege(); - } - - @Override - public WorkflowUserAccessRecord value1(UInteger value) { - setUid(value); - return this; - } - - @Override - public WorkflowUserAccessRecord value2(UInteger value) { - setWid(value); - return this; - } - - @Override - public WorkflowUserAccessRecord value3(WorkflowUserAccessPrivilege value) { - setPrivilege(value); - return this; - } - - @Override - public WorkflowUserAccessRecord values(UInteger value1, UInteger value2, WorkflowUserAccessPrivilege value3) { - value1(value1); - 
value2(value2); - value3(value3); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowUserAccess from) { - setUid(from.getUid()); - setWid(from.getWid()); - setPrivilege(from.getPrivilege()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowUserAccessRecord - */ - public WorkflowUserAccessRecord() { - super(WorkflowUserAccess.WORKFLOW_USER_ACCESS); - } - - /** - * Create a detached, initialised WorkflowUserAccessRecord - */ - public WorkflowUserAccessRecord(UInteger uid, UInteger wid, WorkflowUserAccessPrivilege privilege) { - super(WorkflowUserAccess.WORKFLOW_USER_ACCESS); - - set(0, uid); - set(1, wid); - set(2, privilege); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserActivityRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserActivityRecord.java deleted file mode 100644 index 9ed813c9d5a..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserActivityRecord.java +++ /dev/null @@ -1,277 +0,0 @@ -/* - * This file is generated by jOOQ. 
- */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserActivity; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserActivity; -import org.jooq.Field; -import org.jooq.Record5; -import org.jooq.Row5; -import org.jooq.impl.TableRecordImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserActivityRecord extends TableRecordImpl implements Record5, IWorkflowUserActivity { - - private static final long serialVersionUID = 2045137390; - - /** - * Setter for texera_db.workflow_user_activity.uid. - */ - @Override - public void setUid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_user_activity.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_user_activity.wid. - */ - @Override - public void setWid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_user_activity.wid. - */ - @Override - public UInteger getWid() { - return (UInteger) get(1); - } - - /** - * Setter for texera_db.workflow_user_activity.ip. - */ - @Override - public void setIp(String value) { - set(2, value); - } - - /** - * Getter for texera_db.workflow_user_activity.ip. - */ - @Override - public String getIp() { - return (String) get(2); - } - - /** - * Setter for texera_db.workflow_user_activity.activate. - */ - @Override - public void setActivate(String value) { - set(3, value); - } - - /** - * Getter for texera_db.workflow_user_activity.activate. - */ - @Override - public String getActivate() { - return (String) get(3); - } - - /** - * Setter for texera_db.workflow_user_activity.activity_time. 
- */ - @Override - public void setActivityTime(Timestamp value) { - set(4, value); - } - - /** - * Getter for texera_db.workflow_user_activity.activity_time. - */ - @Override - public Timestamp getActivityTime() { - return (Timestamp) get(4); - } - - // ------------------------------------------------------------------------- - // Record5 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row5 fieldsRow() { - return (Row5) super.fieldsRow(); - } - - @Override - public Row5 valuesRow() { - return (Row5) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowUserActivity.WORKFLOW_USER_ACTIVITY.UID; - } - - @Override - public Field field2() { - return WorkflowUserActivity.WORKFLOW_USER_ACTIVITY.WID; - } - - @Override - public Field field3() { - return WorkflowUserActivity.WORKFLOW_USER_ACTIVITY.IP; - } - - @Override - public Field field4() { - return WorkflowUserActivity.WORKFLOW_USER_ACTIVITY.ACTIVATE; - } - - @Override - public Field field5() { - return WorkflowUserActivity.WORKFLOW_USER_ACTIVITY.ACTIVITY_TIME; - } - - @Override - public UInteger component1() { - return getUid(); - } - - @Override - public UInteger component2() { - return getWid(); - } - - @Override - public String component3() { - return getIp(); - } - - @Override - public String component4() { - return getActivate(); - } - - @Override - public Timestamp component5() { - return getActivityTime(); - } - - @Override - public UInteger value1() { - return getUid(); - } - - @Override - public UInteger value2() { - return getWid(); - } - - @Override - public String value3() { - return getIp(); - } - - @Override - public String value4() { - return getActivate(); - } - - @Override - public Timestamp value5() { - return getActivityTime(); - } - - @Override - public WorkflowUserActivityRecord value1(UInteger value) { - setUid(value); - return this; - } - - @Override - public WorkflowUserActivityRecord 
value2(UInteger value) { - setWid(value); - return this; - } - - @Override - public WorkflowUserActivityRecord value3(String value) { - setIp(value); - return this; - } - - @Override - public WorkflowUserActivityRecord value4(String value) { - setActivate(value); - return this; - } - - @Override - public WorkflowUserActivityRecord value5(Timestamp value) { - setActivityTime(value); - return this; - } - - @Override - public WorkflowUserActivityRecord values(UInteger value1, UInteger value2, String value3, String value4, Timestamp value5) { - value1(value1); - value2(value2); - value3(value3); - value4(value4); - value5(value5); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowUserActivity from) { - setUid(from.getUid()); - setWid(from.getWid()); - setIp(from.getIp()); - setActivate(from.getActivate()); - setActivityTime(from.getActivityTime()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowUserActivityRecord - */ - public WorkflowUserActivityRecord() { - super(WorkflowUserActivity.WORKFLOW_USER_ACTIVITY); - } - - /** - * Create a detached, initialised WorkflowUserActivityRecord - */ - public WorkflowUserActivityRecord(UInteger uid, UInteger wid, String ip, String activate, Timestamp activityTime) { - super(WorkflowUserActivity.WORKFLOW_USER_ACTIVITY); - - set(0, uid); - set(1, wid); - set(2, ip); - set(3, activate); - set(4, activityTime); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserClonesRecord.java 
b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserClonesRecord.java deleted file mode 100644 index 4c97870a473..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserClonesRecord.java +++ /dev/null @@ -1,164 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserClones; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserClones; -import org.jooq.Field; -import org.jooq.Record2; -import org.jooq.Row2; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserClonesRecord extends UpdatableRecordImpl implements Record2, IWorkflowUserClones { - - private static final long serialVersionUID = 1619439529; - - /** - * Setter for texera_db.workflow_user_clones.uid. - */ - @Override - public void setUid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_user_clones.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_user_clones.wid. - */ - @Override - public void setWid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_user_clones.wid. 
- */ - @Override - public UInteger getWid() { - return (UInteger) get(1); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record2 key() { - return (Record2) super.key(); - } - - // ------------------------------------------------------------------------- - // Record2 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row2 fieldsRow() { - return (Row2) super.fieldsRow(); - } - - @Override - public Row2 valuesRow() { - return (Row2) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowUserClones.WORKFLOW_USER_CLONES.UID; - } - - @Override - public Field field2() { - return WorkflowUserClones.WORKFLOW_USER_CLONES.WID; - } - - @Override - public UInteger component1() { - return getUid(); - } - - @Override - public UInteger component2() { - return getWid(); - } - - @Override - public UInteger value1() { - return getUid(); - } - - @Override - public UInteger value2() { - return getWid(); - } - - @Override - public WorkflowUserClonesRecord value1(UInteger value) { - setUid(value); - return this; - } - - @Override - public WorkflowUserClonesRecord value2(UInteger value) { - setWid(value); - return this; - } - - @Override - public WorkflowUserClonesRecord values(UInteger value1, UInteger value2) { - value1(value1); - value2(value2); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowUserClones from) { - setUid(from.getUid()); - setWid(from.getWid()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors 
- // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowUserClonesRecord - */ - public WorkflowUserClonesRecord() { - super(WorkflowUserClones.WORKFLOW_USER_CLONES); - } - - /** - * Create a detached, initialised WorkflowUserClonesRecord - */ - public WorkflowUserClonesRecord(UInteger uid, UInteger wid) { - super(WorkflowUserClones.WORKFLOW_USER_CLONES); - - set(0, uid); - set(1, wid); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserLikesRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserLikesRecord.java deleted file mode 100644 index af0704c4779..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowUserLikesRecord.java +++ /dev/null @@ -1,164 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowUserLikes; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowUserLikes; -import org.jooq.Field; -import org.jooq.Record2; -import org.jooq.Row2; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowUserLikesRecord extends UpdatableRecordImpl implements Record2, IWorkflowUserLikes { - - private static final long serialVersionUID = -1129677921; - - /** - * Setter for texera_db.workflow_user_likes.uid. - */ - @Override - public void setUid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_user_likes.uid. - */ - @Override - public UInteger getUid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_user_likes.wid. 
- */ - @Override - public void setWid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_user_likes.wid. - */ - @Override - public UInteger getWid() { - return (UInteger) get(1); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record2 key() { - return (Record2) super.key(); - } - - // ------------------------------------------------------------------------- - // Record2 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row2 fieldsRow() { - return (Row2) super.fieldsRow(); - } - - @Override - public Row2 valuesRow() { - return (Row2) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowUserLikes.WORKFLOW_USER_LIKES.UID; - } - - @Override - public Field field2() { - return WorkflowUserLikes.WORKFLOW_USER_LIKES.WID; - } - - @Override - public UInteger component1() { - return getUid(); - } - - @Override - public UInteger component2() { - return getWid(); - } - - @Override - public UInteger value1() { - return getUid(); - } - - @Override - public UInteger value2() { - return getWid(); - } - - @Override - public WorkflowUserLikesRecord value1(UInteger value) { - setUid(value); - return this; - } - - @Override - public WorkflowUserLikesRecord value2(UInteger value) { - setWid(value); - return this; - } - - @Override - public WorkflowUserLikesRecord values(UInteger value1, UInteger value2) { - value1(value1); - value2(value2); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowUserLikes from) { - setUid(from.getUid()); - setWid(from.getWid()); - } - - @Override - public E into(E into) { - 
into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowUserLikesRecord - */ - public WorkflowUserLikesRecord() { - super(WorkflowUserLikes.WORKFLOW_USER_LIKES); - } - - /** - * Create a detached, initialised WorkflowUserLikesRecord - */ - public WorkflowUserLikesRecord(UInteger uid, UInteger wid) { - super(WorkflowUserLikes.WORKFLOW_USER_LIKES); - - set(0, uid); - set(1, wid); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowVersionRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowVersionRecord.java deleted file mode 100644 index 269ff75ee9a..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowVersionRecord.java +++ /dev/null @@ -1,247 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowVersion; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowVersion; -import org.jooq.Field; -import org.jooq.Record1; -import org.jooq.Record4; -import org.jooq.Row4; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - -import java.sql.Timestamp; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowVersionRecord extends UpdatableRecordImpl implements Record4, IWorkflowVersion { - - private static final long serialVersionUID = 951015556; - - /** - * Setter for texera_db.workflow_version.vid. - */ - @Override - public void setVid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_version.vid. 
- */ - @Override - public UInteger getVid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_version.wid. - */ - @Override - public void setWid(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_version.wid. - */ - @Override - public UInteger getWid() { - return (UInteger) get(1); - } - - /** - * Setter for texera_db.workflow_version.content. - */ - @Override - public void setContent(String value) { - set(2, value); - } - - /** - * Getter for texera_db.workflow_version.content. - */ - @Override - public String getContent() { - return (String) get(2); - } - - /** - * Setter for texera_db.workflow_version.creation_time. - */ - @Override - public void setCreationTime(Timestamp value) { - set(3, value); - } - - /** - * Getter for texera_db.workflow_version.creation_time. - */ - @Override - public Timestamp getCreationTime() { - return (Timestamp) get(3); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record1 key() { - return (Record1) super.key(); - } - - // ------------------------------------------------------------------------- - // Record4 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row4 fieldsRow() { - return (Row4) super.fieldsRow(); - } - - @Override - public Row4 valuesRow() { - return (Row4) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowVersion.WORKFLOW_VERSION.VID; - } - - @Override - public Field field2() { - return WorkflowVersion.WORKFLOW_VERSION.WID; - } - - @Override - public Field field3() { - return WorkflowVersion.WORKFLOW_VERSION.CONTENT; - } - - @Override - public Field field4() { - return WorkflowVersion.WORKFLOW_VERSION.CREATION_TIME; - } - - @Override - public UInteger component1() { - return getVid(); - } - - 
@Override - public UInteger component2() { - return getWid(); - } - - @Override - public String component3() { - return getContent(); - } - - @Override - public Timestamp component4() { - return getCreationTime(); - } - - @Override - public UInteger value1() { - return getVid(); - } - - @Override - public UInteger value2() { - return getWid(); - } - - @Override - public String value3() { - return getContent(); - } - - @Override - public Timestamp value4() { - return getCreationTime(); - } - - @Override - public WorkflowVersionRecord value1(UInteger value) { - setVid(value); - return this; - } - - @Override - public WorkflowVersionRecord value2(UInteger value) { - setWid(value); - return this; - } - - @Override - public WorkflowVersionRecord value3(String value) { - setContent(value); - return this; - } - - @Override - public WorkflowVersionRecord value4(Timestamp value) { - setCreationTime(value); - return this; - } - - @Override - public WorkflowVersionRecord values(UInteger value1, UInteger value2, String value3, Timestamp value4) { - value1(value1); - value2(value2); - value3(value3); - value4(value4); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowVersion from) { - setVid(from.getVid()); - setWid(from.getWid()); - setContent(from.getContent()); - setCreationTime(from.getCreationTime()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // ------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowVersionRecord - */ - public WorkflowVersionRecord() { - super(WorkflowVersion.WORKFLOW_VERSION); - } - - /** - * Create a detached, initialised WorkflowVersionRecord - */ - public 
WorkflowVersionRecord(UInteger vid, UInteger wid, String content, Timestamp creationTime) { - super(WorkflowVersion.WORKFLOW_VERSION); - - set(0, vid); - set(1, wid); - set(2, content); - set(3, creationTime); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowViewCountRecord.java b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowViewCountRecord.java deleted file mode 100644 index 94e184e9006..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/jooq/generated/tables/records/WorkflowViewCountRecord.java +++ /dev/null @@ -1,165 +0,0 @@ -/* - * This file is generated by jOOQ. - */ -package edu.uci.ics.texera.web.model.jooq.generated.tables.records; - - -import edu.uci.ics.texera.web.model.jooq.generated.tables.WorkflowViewCount; -import edu.uci.ics.texera.web.model.jooq.generated.tables.interfaces.IWorkflowViewCount; -import org.jooq.Field; -import org.jooq.Record1; -import org.jooq.Record2; -import org.jooq.Row2; -import org.jooq.impl.UpdatableRecordImpl; -import org.jooq.types.UInteger; - - -/** - * This class is generated by jOOQ. - */ -@SuppressWarnings({"all", "unchecked", "rawtypes"}) -public class WorkflowViewCountRecord extends UpdatableRecordImpl implements Record2, IWorkflowViewCount { - - private static final long serialVersionUID = -1459174754; - - /** - * Setter for texera_db.workflow_view_count.wid. - */ - @Override - public void setWid(UInteger value) { - set(0, value); - } - - /** - * Getter for texera_db.workflow_view_count.wid. - */ - @Override - public UInteger getWid() { - return (UInteger) get(0); - } - - /** - * Setter for texera_db.workflow_view_count.view_count. - */ - @Override - public void setViewCount(UInteger value) { - set(1, value); - } - - /** - * Getter for texera_db.workflow_view_count.view_count. 
- */ - @Override - public UInteger getViewCount() { - return (UInteger) get(1); - } - - // ------------------------------------------------------------------------- - // Primary key information - // ------------------------------------------------------------------------- - - @Override - public Record1 key() { - return (Record1) super.key(); - } - - // ------------------------------------------------------------------------- - // Record2 type implementation - // ------------------------------------------------------------------------- - - @Override - public Row2 fieldsRow() { - return (Row2) super.fieldsRow(); - } - - @Override - public Row2 valuesRow() { - return (Row2) super.valuesRow(); - } - - @Override - public Field field1() { - return WorkflowViewCount.WORKFLOW_VIEW_COUNT.WID; - } - - @Override - public Field field2() { - return WorkflowViewCount.WORKFLOW_VIEW_COUNT.VIEW_COUNT; - } - - @Override - public UInteger component1() { - return getWid(); - } - - @Override - public UInteger component2() { - return getViewCount(); - } - - @Override - public UInteger value1() { - return getWid(); - } - - @Override - public UInteger value2() { - return getViewCount(); - } - - @Override - public WorkflowViewCountRecord value1(UInteger value) { - setWid(value); - return this; - } - - @Override - public WorkflowViewCountRecord value2(UInteger value) { - setViewCount(value); - return this; - } - - @Override - public WorkflowViewCountRecord values(UInteger value1, UInteger value2) { - value1(value1); - value2(value2); - return this; - } - - // ------------------------------------------------------------------------- - // FROM and INTO - // ------------------------------------------------------------------------- - - @Override - public void from(IWorkflowViewCount from) { - setWid(from.getWid()); - setViewCount(from.getViewCount()); - } - - @Override - public E into(E into) { - into.from(this); - return into; - } - - // 
------------------------------------------------------------------------- - // Constructors - // ------------------------------------------------------------------------- - - /** - * Create a detached WorkflowViewCountRecord - */ - public WorkflowViewCountRecord() { - super(WorkflowViewCount.WORKFLOW_VIEW_COUNT); - } - - /** - * Create a detached, initialised WorkflowViewCountRecord - */ - public WorkflowViewCountRecord(UInteger wid, UInteger viewCount) { - super(WorkflowViewCount.WORKFLOW_VIEW_COUNT); - - set(0, wid); - set(1, viewCount); - } -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/CollaborationResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/CollaborationResource.scala index c0a4a33934b..ac782c93102 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/CollaborationResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/CollaborationResource.scala @@ -6,7 +6,7 @@ import edu.uci.ics.texera.web.ServletAwareConfigurator import edu.uci.ics.texera.web.model.collab.event._ import edu.uci.ics.texera.web.model.collab.request._ import edu.uci.ics.texera.web.model.collab.response.HeartBeatResponse -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import edu.uci.ics.texera.web.resource.CollaborationResource._ import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowAccessResource import org.jooq.types.UInteger diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/UserConfigResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/UserConfigResource.scala index d1d59b64b35..ff364962ee9 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/UserConfigResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/UserConfigResource.scala @@ -3,9 +3,9 @@ package edu.uci.ics.texera.web.resource import 
edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables.USER_CONFIG -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.UserConfigDao -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.{User, UserConfig} +import edu.uci.ics.texera.dao.jooq.generated.Tables.USER_CONFIG +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.UserConfigDao +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.{User, UserConfig} import io.dropwizard.auth.Auth import javax.annotation.security.RolesAllowed diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/WorkflowWebsocketResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/WorkflowWebsocketResource.scala index 803d9aef533..a14843596be 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/WorkflowWebsocketResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/WorkflowWebsocketResource.scala @@ -8,7 +8,7 @@ import edu.uci.ics.amber.error.ErrorUtils.getStackTraceWithAllCauses import edu.uci.ics.amber.virtualidentity.WorkflowIdentity import edu.uci.ics.amber.workflowruntimestate.FatalErrorType.COMPILATION_ERROR import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import edu.uci.ics.texera.web.model.websocket.event.{WorkflowErrorEvent, WorkflowStateEvent} import edu.uci.ics.texera.web.model.websocket.request._ import edu.uci.ics.texera.web.model.websocket.response._ diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/auth/AuthResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/auth/AuthResource.scala index 3634de1b1b3..513d5dbeaef 100644 --- 
a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/auth/AuthResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/auth/AuthResource.scala @@ -10,10 +10,10 @@ import edu.uci.ics.texera.web.model.http.request.auth.{ UserRegistrationRequest } import edu.uci.ics.texera.web.model.http.response.TokenIssueResponse -import edu.uci.ics.texera.web.model.jooq.generated.Tables.USER -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.UserDao -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.Tables.USER +import edu.uci.ics.texera.dao.jooq.generated.enums.UserRole +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.UserDao +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import edu.uci.ics.texera.web.resource.auth.AuthResource._ import org.jasypt.util.password.StrongPasswordEncryptor diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/auth/GoogleAuthResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/auth/GoogleAuthResource.scala index 69942af964a..96d4a4e61da 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/auth/GoogleAuthResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/auth/GoogleAuthResource.scala @@ -13,9 +13,9 @@ import edu.uci.ics.texera.web.auth.JwtAuth.{ jwtToken } import edu.uci.ics.texera.web.model.http.response.TokenIssueResponse -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.UserDao -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.enums.UserRole +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.UserDao +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import 
edu.uci.ics.texera.web.resource.auth.GoogleAuthResource.userDao import java.util.Collections diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/DashboardResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/DashboardResource.scala index fdd7dc125de..acd25618bb3 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/DashboardResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/DashboardResource.scala @@ -1,8 +1,8 @@ package edu.uci.ics.texera.web.resource.dashboard import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables._ -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos._ +import edu.uci.ics.texera.dao.jooq.generated.Tables._ +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos._ import edu.uci.ics.texera.web.resource.dashboard.DashboardResource._ import edu.uci.ics.texera.web.resource.dashboard.SearchQueryBuilder.{ALL_RESOURCE_TYPE, context} import edu.uci.ics.texera.web.resource.dashboard.user.dataset.DatasetResource.DashboardDataset diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/DatasetSearchQueryBuilder.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/DatasetSearchQueryBuilder.scala index 3fb1d3671f0..7157e4f9c40 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/DatasetSearchQueryBuilder.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/DatasetSearchQueryBuilder.scala @@ -1,9 +1,9 @@ package edu.uci.ics.texera.web.resource.dashboard -import edu.uci.ics.texera.web.model.jooq.generated.Tables.{DATASET, DATASET_USER_ACCESS} -import edu.uci.ics.texera.web.model.jooq.generated.enums.DatasetUserAccessPrivilege -import edu.uci.ics.texera.web.model.jooq.generated.tables.User.USER -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.{Dataset, User} 
+import edu.uci.ics.texera.dao.jooq.generated.Tables.{DATASET, DATASET_USER_ACCESS} +import edu.uci.ics.texera.dao.jooq.generated.enums.DatasetUserAccessPrivilege +import edu.uci.ics.texera.dao.jooq.generated.tables.User.USER +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.{Dataset, User} import edu.uci.ics.texera.web.resource.dashboard.DashboardResource.DashboardClickableFileEntry import edu.uci.ics.texera.web.resource.dashboard.FulltextSearchQueryUtils.{ getContainsFilter, diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/ProjectSearchQueryBuilder.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/ProjectSearchQueryBuilder.scala index 9c864c3f4db..a76d2b72a1b 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/ProjectSearchQueryBuilder.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/ProjectSearchQueryBuilder.scala @@ -1,7 +1,7 @@ package edu.uci.ics.texera.web.resource.dashboard -import edu.uci.ics.texera.web.model.jooq.generated.Tables.{PROJECT, PROJECT_USER_ACCESS} -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Project +import edu.uci.ics.texera.dao.jooq.generated.Tables.{PROJECT, PROJECT_USER_ACCESS} +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.Project import edu.uci.ics.texera.web.resource.dashboard.DashboardResource.DashboardClickableFileEntry import edu.uci.ics.texera.web.resource.dashboard.FulltextSearchQueryUtils.{ getContainsFilter, diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/UnifiedResourceSchema.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/UnifiedResourceSchema.scala index a7b0c1abd93..6b526d54443 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/UnifiedResourceSchema.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/UnifiedResourceSchema.scala @@ -2,7 +2,7 @@ package 
edu.uci.ics.texera.web.resource.dashboard import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer -import edu.uci.ics.texera.web.model.jooq.generated.enums.{ +import edu.uci.ics.texera.dao.jooq.generated.enums.{ DatasetUserAccessPrivilege, WorkflowUserAccessPrivilege } diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/WorkflowSearchQueryBuilder.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/WorkflowSearchQueryBuilder.scala index ceaeb54cc85..db43562a742 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/WorkflowSearchQueryBuilder.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/WorkflowSearchQueryBuilder.scala @@ -1,7 +1,7 @@ package edu.uci.ics.texera.web.resource.dashboard -import edu.uci.ics.texera.web.model.jooq.generated.Tables._ -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.Workflow +import edu.uci.ics.texera.dao.jooq.generated.Tables._ +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.Workflow import edu.uci.ics.texera.web.resource.dashboard.DashboardResource.DashboardClickableFileEntry import edu.uci.ics.texera.web.resource.dashboard.FulltextSearchQueryUtils._ import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowResource.DashboardWorkflow diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/admin/execution/AdminExecutionResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/admin/execution/AdminExecutionResource.scala index 0186304317d..5bdb31437ee 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/admin/execution/AdminExecutionResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/admin/execution/AdminExecutionResource.scala @@ -3,7 +3,7 @@ package edu.uci.ics.texera.web.resource.dashboard.admin.execution import 
edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables._ +import edu.uci.ics.texera.dao.jooq.generated.Tables._ import edu.uci.ics.texera.web.resource.dashboard.admin.execution.AdminExecutionResource._ import io.dropwizard.auth.Auth import org.jooq.types.UInteger diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/admin/user/AdminUserResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/admin/user/AdminUserResource.scala index a09e3115b58..dccb8d70b05 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/admin/user/AdminUserResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/admin/user/AdminUserResource.scala @@ -2,9 +2,9 @@ package edu.uci.ics.texera.web.resource.dashboard.admin.user import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.UserDao -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.enums.UserRole +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.UserDao +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import edu.uci.ics.texera.web.resource.dashboard.admin.user.AdminUserResource.userDao import edu.uci.ics.texera.web.resource.dashboard.user.quota.UserQuotaResource._ import org.jasypt.util.password.StrongPasswordEncryptor diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/hub/workflow/HubWorkflowResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/hub/workflow/HubWorkflowResource.scala index e699793013d..15022a658a8 100644 --- 
a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/hub/workflow/HubWorkflowResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/hub/workflow/HubWorkflowResource.scala @@ -2,9 +2,9 @@ package edu.uci.ics.texera.web.resource.dashboard.hub.workflow import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer -import edu.uci.ics.texera.web.model.jooq.generated.Tables._ -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.WorkflowDao -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.{User, Workflow} +import edu.uci.ics.texera.dao.jooq.generated.Tables._ +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.WorkflowDao +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.{User, Workflow} import edu.uci.ics.texera.web.resource.dashboard.hub.workflow.HubWorkflowResource.{ fetchDashboardWorkflowsByWids, recordUserActivity diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/DatasetAccessResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/DatasetAccessResource.scala index ae4a629cb05..4f54decbf5e 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/DatasetAccessResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/DatasetAccessResource.scala @@ -4,16 +4,12 @@ import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.amber.engine.common.Utils.withTransaction import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.model.common.AccessEntry -import edu.uci.ics.texera.web.model.jooq.generated.Tables.USER -import edu.uci.ics.texera.web.model.jooq.generated.enums.DatasetUserAccessPrivilege -import edu.uci.ics.texera.web.model.jooq.generated.tables.Dataset.DATASET -import edu.uci.ics.texera.web.model.jooq.generated.tables.DatasetUserAccess.DATASET_USER_ACCESS 
-import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.{ - DatasetDao, - DatasetUserAccessDao, - UserDao -} -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.{DatasetUserAccess, User} +import edu.uci.ics.texera.dao.jooq.generated.Tables.USER +import edu.uci.ics.texera.dao.jooq.generated.enums.DatasetUserAccessPrivilege +import edu.uci.ics.texera.dao.jooq.generated.tables.Dataset.DATASET +import edu.uci.ics.texera.dao.jooq.generated.tables.DatasetUserAccess.DATASET_USER_ACCESS +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.{DatasetDao, DatasetUserAccessDao, UserDao} +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.{DatasetUserAccess, User} import edu.uci.ics.texera.web.resource.dashboard.user.dataset.DatasetAccessResource.{ context, getOwner diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/DatasetResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/DatasetResource.scala index 81798897f7c..36d43e9905b 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/DatasetResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/DatasetResource.scala @@ -10,17 +10,17 @@ import edu.uci.ics.amber.engine.common.Utils.withTransaction import edu.uci.ics.amber.util.PathUtils import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.enums.DatasetUserAccessPrivilege -import edu.uci.ics.texera.web.model.jooq.generated.tables.Dataset.DATASET -import edu.uci.ics.texera.web.model.jooq.generated.tables.DatasetUserAccess.DATASET_USER_ACCESS -import edu.uci.ics.texera.web.model.jooq.generated.tables.DatasetVersion.DATASET_VERSION -import edu.uci.ics.texera.web.model.jooq.generated.tables.User.USER -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.{ +import 
edu.uci.ics.texera.dao.jooq.generated.enums.DatasetUserAccessPrivilege +import edu.uci.ics.texera.dao.jooq.generated.tables.Dataset.DATASET +import edu.uci.ics.texera.dao.jooq.generated.tables.DatasetUserAccess.DATASET_USER_ACCESS +import edu.uci.ics.texera.dao.jooq.generated.tables.DatasetVersion.DATASET_VERSION +import edu.uci.ics.texera.dao.jooq.generated.tables.User.USER +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.{ DatasetDao, DatasetUserAccessDao, DatasetVersionDao } -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.{ +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.{ Dataset, DatasetUserAccess, DatasetVersion, diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/utils/DatasetStatisticsUtils.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/utils/DatasetStatisticsUtils.scala index 63d1ee03ce3..561a50f2f02 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/utils/DatasetStatisticsUtils.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/dataset/utils/DatasetStatisticsUtils.scala @@ -2,7 +2,7 @@ package edu.uci.ics.texera.web.resource.dashboard.user.dataset.utils import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer -import edu.uci.ics.texera.web.model.jooq.generated.tables.Dataset.DATASET +import edu.uci.ics.texera.dao.jooq.generated.tables.Dataset.DATASET import edu.uci.ics.texera.web.resource.dashboard.user.dataset.DatasetResource import edu.uci.ics.texera.web.resource.dashboard.user.quota.UserQuotaResource.DatasetQuota import org.jooq.types.UInteger diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/ProjectAccessResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/ProjectAccessResource.scala index 31021c316d0..59e6b5b7d38 100644 --- 
a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/ProjectAccessResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/ProjectAccessResource.scala @@ -3,14 +3,10 @@ package edu.uci.ics.texera.web.resource.dashboard.user.project import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.model.common.AccessEntry -import edu.uci.ics.texera.web.model.jooq.generated.Tables.{PROJECT_USER_ACCESS, USER} -import edu.uci.ics.texera.web.model.jooq.generated.enums.ProjectUserAccessPrivilege -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.{ - ProjectDao, - ProjectUserAccessDao, - UserDao -} -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.ProjectUserAccess +import edu.uci.ics.texera.dao.jooq.generated.Tables.{PROJECT_USER_ACCESS, USER} +import edu.uci.ics.texera.dao.jooq.generated.enums.ProjectUserAccessPrivilege +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.{ProjectDao, ProjectUserAccessDao, UserDao} +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.ProjectUserAccess import org.jooq.DSLContext import org.jooq.types.UInteger diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/ProjectResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/ProjectResource.scala index 7d0a667a2bc..9963721cd57 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/ProjectResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/ProjectResource.scala @@ -3,14 +3,14 @@ package edu.uci.ics.texera.web.resource.dashboard.user.project import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables._ -import 
edu.uci.ics.texera.web.model.jooq.generated.enums.ProjectUserAccessPrivilege -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.{ +import edu.uci.ics.texera.dao.jooq.generated.Tables._ +import edu.uci.ics.texera.dao.jooq.generated.enums.ProjectUserAccessPrivilege +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.{ ProjectDao, ProjectUserAccessDao, WorkflowOfProjectDao } -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos._ +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos._ import edu.uci.ics.texera.web.resource.dashboard.DashboardResource import edu.uci.ics.texera.web.resource.dashboard.DashboardResource.SearchQueryParams import edu.uci.ics.texera.web.resource.dashboard.user.project.ProjectResource._ diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/PublicProjectResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/PublicProjectResource.scala index 0240bbce338..9e20262d7e4 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/PublicProjectResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/project/PublicProjectResource.scala @@ -3,13 +3,10 @@ package edu.uci.ics.texera.web.resource.dashboard.user.project import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables.{PROJECT, PUBLIC_PROJECT, USER} -import edu.uci.ics.texera.web.model.jooq.generated.enums.ProjectUserAccessPrivilege -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.{ - ProjectUserAccessDao, - PublicProjectDao -} -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.{ProjectUserAccess, PublicProject} +import edu.uci.ics.texera.dao.jooq.generated.Tables.{PROJECT, PUBLIC_PROJECT, USER} +import 
edu.uci.ics.texera.dao.jooq.generated.enums.ProjectUserAccessPrivilege +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.{ProjectUserAccessDao, PublicProjectDao} +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.{ProjectUserAccess, PublicProject} import io.dropwizard.auth.Auth import org.jooq.DSLContext import org.jooq.types.UInteger diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/quota/UserQuotaResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/quota/UserQuotaResource.scala index 7f55b4ed566..4a9f78f44ef 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/quota/UserQuotaResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/quota/UserQuotaResource.scala @@ -5,7 +5,7 @@ import edu.uci.ics.amber.core.storage.util.mongo.MongoDatabaseManager import edu.uci.ics.amber.core.storage.util.mongo.MongoDatabaseManager.database import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables._ +import edu.uci.ics.texera.dao.jooq.generated.Tables._ import edu.uci.ics.texera.web.resource.dashboard.user.dataset.utils.DatasetStatisticsUtils.getUserCreatedDatasets import edu.uci.ics.texera.web.resource.dashboard.user.quota.UserQuotaResource._ import io.dropwizard.auth.Auth diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowAccessResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowAccessResource.scala index 5e4a1b8cfc3..9adce4f3eb5 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowAccessResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowAccessResource.scala @@ -4,14 +4,14 @@ import 
edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser import edu.uci.ics.texera.web.model.common.AccessEntry -import edu.uci.ics.texera.web.model.jooq.generated.Tables._ -import edu.uci.ics.texera.web.model.jooq.generated.enums.WorkflowUserAccessPrivilege -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.{ +import edu.uci.ics.texera.dao.jooq.generated.Tables._ +import edu.uci.ics.texera.dao.jooq.generated.enums.WorkflowUserAccessPrivilege +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.{ UserDao, WorkflowOfUserDao, WorkflowUserAccessDao } -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowUserAccess +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.WorkflowUserAccess import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowAccessResource.context import io.dropwizard.auth.Auth import org.jooq.DSLContext diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala index 0f5d4e55e2b..8758fab0b85 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala @@ -6,17 +6,17 @@ import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage import edu.uci.ics.amber.virtualidentity.{ChannelMarkerIdentity, ExecutionIdentity} import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables.{ +import edu.uci.ics.texera.dao.jooq.generated.Tables.{ USER, WORKFLOW_EXECUTIONS, WORKFLOW_RUNTIME_STATISTICS, WORKFLOW_VERSION } -import 
edu.uci.ics.texera.web.model.jooq.generated.tables.daos.{ +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.{ WorkflowExecutionsDao, WorkflowRuntimeStatisticsDao } -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.{ +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.{ WorkflowExecutions, WorkflowRuntimeStatistics } diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowResource.scala index 6e8d226f75f..0b8e5c6f3fa 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowResource.scala @@ -6,15 +6,15 @@ import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables._ -import edu.uci.ics.texera.web.model.jooq.generated.enums.WorkflowUserAccessPrivilege -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.{ +import edu.uci.ics.texera.dao.jooq.generated.Tables._ +import edu.uci.ics.texera.dao.jooq.generated.enums.WorkflowUserAccessPrivilege +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.{ WorkflowDao, WorkflowOfProjectDao, WorkflowOfUserDao, WorkflowUserAccessDao } -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos._ +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos._ import edu.uci.ics.texera.web.resource.dashboard.hub.workflow.HubWorkflowResource.recordUserActivity import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowAccessResource.hasReadAccess import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowResource._ diff --git 
a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowVersionResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowVersionResource.scala index 295985dacb2..ed2cfa95a37 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowVersionResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowVersionResource.scala @@ -6,9 +6,9 @@ import edu.uci.ics.amber.engine.common.AmberConfig import edu.uci.ics.amber.engine.common.Utils.objectMapper import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables.WORKFLOW_VERSION -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.{WorkflowDao, WorkflowVersionDao} -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.{Workflow, WorkflowVersion} +import edu.uci.ics.texera.dao.jooq.generated.Tables.WORKFLOW_VERSION +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.{WorkflowDao, WorkflowVersionDao} +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.{Workflow, WorkflowVersion} import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowResource.{ DashboardWorkflow, assignNewOperatorIds diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionStatsService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionStatsService.scala index bea692d76db..38a1098c58e 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionStatsService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionStatsService.scala @@ -24,7 +24,7 @@ import edu.uci.ics.amber.error.ErrorUtils.{getOperatorFromActorIdOpt, getStackTr import edu.uci.ics.amber.workflowruntimestate.FatalErrorType.EXECUTION_FAILURE import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError 
import edu.uci.ics.texera.web.SubscriptionManager -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowRuntimeStatistics +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.WorkflowRuntimeStatistics import edu.uci.ics.texera.web.model.websocket.event.{ ExecutionDurationUpdateEvent, OperatorAggregatedMetrics, diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionsMetadataPersistService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionsMetadataPersistService.scala index f83f5abfbf0..d8b259acdaa 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionsMetadataPersistService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionsMetadataPersistService.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.workflow.WorkflowContext.DEFAULT_EXECUTION_ID import edu.uci.ics.amber.engine.common.AmberConfig import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.texera.dao.SqlServer -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.WorkflowExecutionsDao -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.WorkflowExecutions +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.WorkflowExecutionsDao +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.WorkflowExecutions import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowVersionResource._ import org.jooq.types.UInteger diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala index 13c06944037..c2a0693bd85 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala @@ -13,7 +13,7 @@ import edu.uci.ics.amber.core.tuple.Tuple import 
edu.uci.ics.amber.engine.common.Utils.retry import edu.uci.ics.amber.util.PathUtils import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, WorkflowIdentity} -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import edu.uci.ics.texera.web.model.websocket.request.ResultExportRequest import edu.uci.ics.texera.web.model.websocket.response.ResultExportResponse import edu.uci.ics.texera.web.resource.GoogleResource diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala index dc1b31f73a6..a4c3c0c0687 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala @@ -23,7 +23,7 @@ import edu.uci.ics.amber.virtualidentity.{ } import edu.uci.ics.amber.workflowruntimestate.FatalErrorType.EXECUTION_FAILURE import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.User +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import edu.uci.ics.texera.web.model.websocket.event.TexeraWebSocketEvent import edu.uci.ics.texera.web.model.websocket.request.WorkflowExecuteRequest import edu.uci.ics.texera.web.service.WorkflowService.mkWorkflowStateId diff --git a/core/amber/src/test/scala/edu/uci/ics/texera/web/resource/dashboard/file/WorkflowResourceSpec.scala b/core/amber/src/test/scala/edu/uci/ics/texera/web/resource/dashboard/file/WorkflowResourceSpec.scala index 54030c8c5d9..b6de9bb8fed 100644 --- a/core/amber/src/test/scala/edu/uci/ics/texera/web/resource/dashboard/file/WorkflowResourceSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/texera/web/resource/dashboard/file/WorkflowResourceSpec.scala @@ -2,10 +2,10 @@ package edu.uci.ics.texera.web.resource.dashboard.file import 
edu.uci.ics.texera.dao.MockTexeraDB import edu.uci.ics.texera.web.auth.SessionUser -import edu.uci.ics.texera.web.model.jooq.generated.Tables.{USER, WORKFLOW, WORKFLOW_OF_PROJECT} -import edu.uci.ics.texera.web.model.jooq.generated.enums.UserRole -import edu.uci.ics.texera.web.model.jooq.generated.tables.daos.UserDao -import edu.uci.ics.texera.web.model.jooq.generated.tables.pojos.{Project, User, Workflow} +import edu.uci.ics.texera.dao.jooq.generated.Tables.{USER, WORKFLOW, WORKFLOW_OF_PROJECT} +import edu.uci.ics.texera.dao.jooq.generated.enums.UserRole +import edu.uci.ics.texera.dao.jooq.generated.tables.daos.UserDao +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.{Project, User, Workflow} import edu.uci.ics.texera.web.resource.dashboard.DashboardResource.SearchQueryParams import edu.uci.ics.texera.web.resource.dashboard.user.project.ProjectResource import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowResource From 752306f137becd609c4fcf2f792c96f072a70312 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Tue, 17 Dec 2024 00:12:52 -0800 Subject: [PATCH 06/47] Modify texera io links in README (#3162) This PR modifies the links in README to texera.io. 1. Add link to landing page of texera.io on the texera logo. 2. Change urls to be more meaningful, instead of a numerical number. 3. Add links to publications, video, etc. --- README.md | 8 ++++++-- 1 file changed, 6 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 7c2f1dc2c51..99fe5967044 100644 --- a/README.md +++ b/README.md @@ -1,7 +1,7 @@

    Texera - Collaborative Data Science and AI/ML Using Workflows

    - texera-logo + texera-logo
    Texera supports scalable data computation and enables advanced AI/ML techniques.
    @@ -11,7 +11,11 @@

    Official Site | - Blogs + Publications + | + Video + | + Blog | Getting Started
    From 270f6d7d662a4c54c7df34ad8f0eaa8beae0a0a7 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Tue, 17 Dec 2024 00:37:17 -0800 Subject: [PATCH 07/47] Remove cache source descriptor (#3163) This PR removes the cache source descriptor. We explicitly create a physical operator to read cache during scheduling (cut off physical links for materialization). --- .../scheduling/ScheduleGenerator.scala | 53 ++++++++++++------- .../source/cache/CacheSourceOpDesc.scala | 45 ---------------- 2 files changed, 33 insertions(+), 65 deletions(-) delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpDesc.scala diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala index 217d77aa14e..800d8344fd3 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala @@ -1,17 +1,23 @@ package edu.uci.ics.amber.engine.architecture.scheduling +import edu.uci.ics.amber.core.executor.OpExecInitInfo import edu.uci.ics.amber.core.storage.result.{OpResultStorage, ResultStorage} import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.{PhysicalOp, PhysicalPlan, WorkflowContext} +import edu.uci.ics.amber.core.workflow.{ + PhysicalOp, + PhysicalPlan, + SchemaPropagationFunc, + WorkflowContext +} import edu.uci.ics.amber.engine.architecture.scheduling.ScheduleGenerator.replaceVertex import edu.uci.ics.amber.engine.architecture.scheduling.resourcePolicies.{ DefaultResourceAllocator, ExecutionClusterInfo } import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc -import edu.uci.ics.amber.operator.source.cache.CacheSourceOpDesc +import 
edu.uci.ics.amber.operator.source.cache.CacheSourceOpExec import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, PhysicalOpIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.PhysicalLink +import edu.uci.ics.amber.workflow.{OutputPort, PhysicalLink} import org.jgrapht.graph.DirectedAcyclicGraph import org.jgrapht.traverse.TopologicalOrderIterator @@ -155,7 +161,7 @@ abstract class ScheduleGenerator( // create cache writer and link val matWriterInputSchema = fromOp.outputPorts(fromPortId)._3.toOption.get val matWriterPhysicalOp: PhysicalOp = - createMatWriter(physicalLink, Array(matWriterInputSchema), workflowContext.workflowId) + createMatWriter(physicalLink, Array(matWriterInputSchema)) val sourceToWriterLink = PhysicalLink( fromOp.id, @@ -169,7 +175,7 @@ abstract class ScheduleGenerator( // create cache reader and link val matReaderPhysicalOp: PhysicalOp = - createMatReader(matWriterPhysicalOp.id.logicalOpId, physicalLink, workflowContext.workflowId) + createMatReader(matWriterPhysicalOp.id.logicalOpId, physicalLink) val readerToDestLink = PhysicalLink( matReaderPhysicalOp.id, @@ -186,20 +192,28 @@ abstract class ScheduleGenerator( private def createMatReader( matWriterLogicalOpId: OperatorIdentity, - physicalLink: PhysicalLink, - workflowIdentity: WorkflowIdentity + physicalLink: PhysicalLink ): PhysicalOp = { - val matReader = new CacheSourceOpDesc( - matWriterLogicalOpId, - ResultStorage.getOpResultStorage(workflowIdentity) - ) - matReader.setContext(workflowContext) - matReader.setOperatorId(s"cacheSource_${getMatIdFromPhysicalLink(physicalLink)}") - - matReader - .getPhysicalOp( + val opResultStorage = ResultStorage.getOpResultStorage(workflowContext.workflowId) + PhysicalOp + .sourcePhysicalOp( workflowContext.workflowId, - workflowContext.executionId + workflowContext.executionId, + OperatorIdentity(s"cacheSource_${getMatIdFromPhysicalLink(physicalLink)}"), + OpExecInitInfo((_, _) => + new CacheSourceOpExec( + 
opResultStorage.get(matWriterLogicalOpId) + ) + ) + ) + .withInputPorts(List.empty) + .withOutputPorts(List(OutputPort())) + .withPropagateSchema( + SchemaPropagationFunc(_ => + Map( + OutputPort().id -> opResultStorage.getSchema(matWriterLogicalOpId).get + ) + ) ) .propagateSchema() @@ -207,8 +221,7 @@ abstract class ScheduleGenerator( private def createMatWriter( physicalLink: PhysicalLink, - inputSchema: Array[Schema], - workflowIdentity: WorkflowIdentity + inputSchema: Array[Schema] ): PhysicalOp = { val matWriter = new ProgressiveSinkOpDesc() matWriter.setContext(workflowContext) @@ -216,7 +229,7 @@ abstract class ScheduleGenerator( // expect exactly one input port and one output port val schema = matWriter.getOutputSchema(inputSchema) ResultStorage - .getOpResultStorage(workflowIdentity) + .getOpResultStorage(workflowContext.workflowId) .create( key = matWriter.operatorIdentifier, mode = OpResultStorage.defaultStorageMode, diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpDesc.scala deleted file mode 100644 index 535e2d315c3..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpDesc.scala +++ /dev/null @@ -1,45 +0,0 @@ -package edu.uci.ics.amber.operator.source.cache - -import edu.uci.ics.amber.core.executor.OpExecInitInfo -import edu.uci.ics.amber.core.storage.result.OpResultStorage -import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} -import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, OperatorIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.OutputPort - -class CacheSourceOpDesc(val targetSinkStorageId: 
OperatorIdentity, opResultStorage: OpResultStorage) - extends SourceOperatorDescriptor { - assert(null != targetSinkStorageId) - assert(null != opResultStorage) - - override def sourceSchema(): Schema = opResultStorage.getSchema(targetSinkStorageId).get - - override def getPhysicalOp( - workflowId: WorkflowIdentity, - executionId: ExecutionIdentity - ): PhysicalOp = { - PhysicalOp - .sourcePhysicalOp( - workflowId, - executionId, - operatorIdentifier, - OpExecInitInfo((_, _) => new CacheSourceOpExec(opResultStorage.get(targetSinkStorageId))) - ) - .withInputPorts(operatorInfo.inputPorts) - .withOutputPorts(operatorInfo.outputPorts) - .withPropagateSchema( - SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> sourceSchema())) - ) - } - - override def operatorInfo: OperatorInfo = - OperatorInfo( - "Cache Source Operator", - "Retrieve the cached output to src", - OperatorGroupConstants.UTILITY_GROUP, - inputPorts = List.empty, - outputPorts = List(OutputPort()) - ) -} From 64dd0c1e39a174c4b48b636e4688bc5c00fdeea8 Mon Sep 17 00:00:00 2001 From: GspikeHalo <109092664+GspikeHalo@users.noreply.github.com> Date: Tue, 17 Dec 2024 21:49:17 -0800 Subject: [PATCH 08/47] Modify the name 'community' to 'hub' (#3167) ### Purpose: Since "Hub" has been chosen as the more formal name for the community, the previous name needs to be replaced with "Hub." ### Change: The 'Community' in the left sidebar has been replaced with 'Hub'. 
### Demo: Before: ![image](https://github.com/user-attachments/assets/29604593-7453-43d7-ab80-1f37637752f7) After: ![image](https://github.com/user-attachments/assets/ac41b971-9f10-4e35-b0a0-8ba37bd200a9) --- core/gui/src/app/dashboard/component/dashboard.component.html | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/core/gui/src/app/dashboard/component/dashboard.component.html b/core/gui/src/app/dashboard/component/dashboard.component.html index 84bf395c266..059146949be 100644 --- a/core/gui/src/app/dashboard/component/dashboard.component.html +++ b/core/gui/src/app/dashboard/component/dashboard.component.html @@ -21,7 +21,7 @@
  • From 7ae89dc0bc12b80d36255d84c3c8914cde78e1dd Mon Sep 17 00:00:00 2001 From: GspikeHalo <109092664+GspikeHalo@users.noreply.github.com> Date: Tue, 17 Dec 2024 22:15:05 -0800 Subject: [PATCH 09/47] Standardize the "is_public" column across the dataset and workflow tables. (#3168) ### Purpose: Although the column indicating publication already exists in both the dataset and workflow tables, they have different names and types. ### Change: Modified the dataset and workflow tables to unify the format of is_public across different tables. **To complete the database update, please execute `18.sql` located in the update folder.** --- .../WorkflowSearchQueryBuilder.scala | 4 ++-- .../hub/workflow/HubWorkflowResource.scala | 10 ++++---- .../workflow/WorkflowAccessResource.scala | 2 +- .../user/workflow/WorkflowResource.scala | 8 +++---- .../dao/jooq/generated/tables/Workflow.java | 6 ++--- .../generated/tables/daos/WorkflowDao.java | 12 +++++----- .../tables/interfaces/IWorkflow.java | 8 +++---- .../jooq/generated/tables/pojos/Workflow.java | 22 ++++++++--------- .../tables/records/WorkflowRecord.java | 24 +++++++++---------- core/scripts/sql/texera_ddl.sql | 8 ++++++- core/scripts/sql/update/18.sql | 7 ++++++ 11 files changed, 62 insertions(+), 49 deletions(-) create mode 100644 core/scripts/sql/update/18.sql diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/WorkflowSearchQueryBuilder.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/WorkflowSearchQueryBuilder.scala index db43562a742..5ade3dff372 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/WorkflowSearchQueryBuilder.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/WorkflowSearchQueryBuilder.scala @@ -49,12 +49,12 @@ object WorkflowSearchQueryBuilder extends SearchQueryBuilder { var condition: Condition = DSL.trueCondition() if (uid == null) { - condition = WORKFLOW.IS_PUBLISHED.eq(1.toByte) + 
condition = WORKFLOW.IS_PUBLIC.eq(1.toByte) } else { val privateAccessCondition = WORKFLOW_USER_ACCESS.UID.eq(uid).or(PROJECT_USER_ACCESS.UID.eq(uid)) if (includePublic) { - condition = privateAccessCondition.or(WORKFLOW.IS_PUBLISHED.eq(1.toByte)) + condition = privateAccessCondition.or(WORKFLOW.IS_PUBLIC.eq(1.toByte)) } else { condition = privateAccessCondition } diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/hub/workflow/HubWorkflowResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/hub/workflow/HubWorkflowResource.scala index 15022a658a8..f680e50b2cb 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/hub/workflow/HubWorkflowResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/hub/workflow/HubWorkflowResource.scala @@ -130,7 +130,7 @@ class HubWorkflowResource { def getPublishedWorkflowCount: Integer = { context.selectCount .from(WORKFLOW) - .where(WORKFLOW.IS_PUBLISHED.eq(1.toByte)) + .where(WORKFLOW.IS_PUBLIC.eq(1.toByte)) .fetchOne(0, classOf[Integer]) } @@ -174,7 +174,7 @@ class HubWorkflowResource { val workflow = workflowDao.ctx .selectFrom(WORKFLOW) .where(WORKFLOW.WID.eq(wid)) - .and(WORKFLOW.IS_PUBLISHED.isTrue) + .and(WORKFLOW.IS_PUBLIC.isTrue) .fetchOne() WorkflowWithPrivilege( workflow.getName, @@ -183,7 +183,7 @@ class HubWorkflowResource { workflow.getContent, workflow.getCreationTime, workflow.getLastModifiedTime, - workflow.getIsPublished, + workflow.getIsPublic, readonly = true ) } @@ -328,7 +328,7 @@ class HubWorkflowResource { .from(WORKFLOW_USER_LIKES) .join(WORKFLOW) .on(WORKFLOW_USER_LIKES.WID.eq(WORKFLOW.WID)) - .where(WORKFLOW.IS_PUBLISHED.eq(1.toByte)) + .where(WORKFLOW.IS_PUBLIC.eq(1.toByte)) .groupBy(WORKFLOW_USER_LIKES.WID) .orderBy(DSL.count(WORKFLOW_USER_LIKES.WID).desc()) .limit(8) @@ -350,7 +350,7 @@ class HubWorkflowResource { .from(WORKFLOW_USER_CLONES) .join(WORKFLOW) 
.on(WORKFLOW_USER_CLONES.WID.eq(WORKFLOW.WID)) - .where(WORKFLOW.IS_PUBLISHED.eq(1.toByte)) + .where(WORKFLOW.IS_PUBLIC.eq(1.toByte)) .groupBy(WORKFLOW_USER_CLONES.WID) .orderBy(DSL.count(WORKFLOW_USER_CLONES.WID).desc()) .limit(8) diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowAccessResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowAccessResource.scala index 9adce4f3eb5..53e9bc76bd2 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowAccessResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowAccessResource.scala @@ -83,7 +83,7 @@ object WorkflowAccessResource { def isPublic(wid: UInteger): Boolean = { context - .select(WORKFLOW.IS_PUBLISHED) + .select(WORKFLOW.IS_PUBLIC) .from(WORKFLOW) .where(WORKFLOW.WID.eq(wid)) .fetchOneInto(classOf[Boolean]) diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowResource.scala index 0b8e5c6f3fa..250fa7a5778 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowResource.scala @@ -332,7 +332,7 @@ class WorkflowResource extends LazyLogging { workflow.getContent, workflow.getCreationTime, workflow.getLastModifiedTime, - workflow.getIsPublished, + workflow.getIsPublic, !WorkflowAccessResource.hasWriteAccess(wid, user.getUid) ) } else { @@ -568,7 +568,7 @@ class WorkflowResource extends LazyLogging { @Path("/public/{wid}") def makePublic(@PathParam("wid") wid: UInteger, @Auth user: SessionUser): Unit = { val workflow: Workflow = workflowDao.fetchOneByWid(wid) - workflow.setIsPublished(1.toByte) + 
workflow.setIsPublic(1.toByte) workflowDao.update(workflow) } @@ -576,7 +576,7 @@ class WorkflowResource extends LazyLogging { @Path("/private/{wid}") def makePrivate(@PathParam("wid") wid: UInteger): Unit = { val workflow: Workflow = workflowDao.fetchOneByWid(wid) - workflow.setIsPublished(0.toByte) + workflow.setIsPublic(0.toByte) workflowDao.update(workflow) } @@ -584,7 +584,7 @@ class WorkflowResource extends LazyLogging { @Path("/type/{wid}") def getWorkflowType(@PathParam("wid") wid: UInteger): String = { val workflow: Workflow = workflowDao.fetchOneByWid(wid) - if (workflow.getIsPublished == 1.toByte) { + if (workflow.getIsPublic == 1.toByte) { "Public" } else { "Private" diff --git a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/Workflow.java b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/Workflow.java index 8b76e180bcb..e730b28063e 100644 --- a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/Workflow.java +++ b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/Workflow.java @@ -35,7 +35,7 @@ @SuppressWarnings({ "all", "unchecked", "rawtypes" }) public class Workflow extends TableImpl { - private static final long serialVersionUID = -256100701; + private static final long serialVersionUID = 1942522218; /** * The reference instance of texera_db.workflow @@ -81,9 +81,9 @@ public Class getRecordType() { public final TableField LAST_MODIFIED_TIME = createField(DSL.name("last_modified_time"), org.jooq.impl.SQLDataType.TIMESTAMP.nullable(false).defaultValue(org.jooq.impl.DSL.field("CURRENT_TIMESTAMP", org.jooq.impl.SQLDataType.TIMESTAMP)), this, ""); /** - * The column texera_db.workflow.is_published. + * The column texera_db.workflow.is_public. 
*/ - public final TableField IS_PUBLISHED = createField(DSL.name("is_published"), org.jooq.impl.SQLDataType.TINYINT.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.TINYINT)), this, ""); + public final TableField IS_PUBLIC = createField(DSL.name("is_public"), org.jooq.impl.SQLDataType.TINYINT.nullable(false).defaultValue(org.jooq.impl.DSL.inline("0", org.jooq.impl.SQLDataType.TINYINT)), this, ""); /** * Create a texera_db.workflow table reference diff --git a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/daos/WorkflowDao.java b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/daos/WorkflowDao.java index 769d2b953ce..a17088023ee 100644 --- a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/daos/WorkflowDao.java +++ b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/daos/WorkflowDao.java @@ -132,16 +132,16 @@ public List fetchBy } /** - * Fetch records that have is_published BETWEEN lowerInclusive AND upperInclusive + * Fetch records that have is_public BETWEEN lowerInclusive AND upperInclusive */ - public List fetchRangeOfIsPublished(Byte lowerInclusive, Byte upperInclusive) { - return fetchRange(Workflow.WORKFLOW.IS_PUBLISHED, lowerInclusive, upperInclusive); + public List fetchRangeOfIsPublic(Byte lowerInclusive, Byte upperInclusive) { + return fetchRange(Workflow.WORKFLOW.IS_PUBLIC, lowerInclusive, upperInclusive); } /** - * Fetch records that have is_published IN (values) + * Fetch records that have is_public IN (values) */ - public List fetchByIsPublished(Byte... values) { - return fetch(Workflow.WORKFLOW.IS_PUBLISHED, values); + public List fetchByIsPublic(Byte... 
values) { + return fetch(Workflow.WORKFLOW.IS_PUBLIC, values); } } diff --git a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/interfaces/IWorkflow.java b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/interfaces/IWorkflow.java index c90b4bca901..36206e6a0d2 100644 --- a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/interfaces/IWorkflow.java +++ b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/interfaces/IWorkflow.java @@ -77,14 +77,14 @@ public interface IWorkflow extends Serializable { public Timestamp getLastModifiedTime(); /** - * Setter for texera_db.workflow.is_published. + * Setter for texera_db.workflow.is_public. */ - public void setIsPublished(Byte value); + public void setIsPublic(Byte value); /** - * Getter for texera_db.workflow.is_published. + * Getter for texera_db.workflow.is_public. */ - public Byte getIsPublished(); + public Byte getIsPublic(); // ------------------------------------------------------------------------- // FROM and INTO diff --git a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/pojos/Workflow.java b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/pojos/Workflow.java index 020ac283a7f..bdd7236748d 100644 --- a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/pojos/Workflow.java +++ b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/pojos/Workflow.java @@ -17,7 +17,7 @@ @SuppressWarnings({ "all", "unchecked", "rawtypes" }) public class Workflow implements IWorkflow { - private static final long serialVersionUID = -212123101; + private static final long serialVersionUID = 1973264418; private String name; private String description; @@ -25,7 +25,7 @@ public class Workflow implements IWorkflow { private String content; private Timestamp creationTime; private Timestamp lastModifiedTime; - private Byte isPublished; + private Byte isPublic; public Workflow() {} @@ 
-36,7 +36,7 @@ public Workflow(IWorkflow value) { this.content = value.getContent(); this.creationTime = value.getCreationTime(); this.lastModifiedTime = value.getLastModifiedTime(); - this.isPublished = value.getIsPublished(); + this.isPublic = value.getIsPublic(); } public Workflow( @@ -46,7 +46,7 @@ public Workflow( String content, Timestamp creationTime, Timestamp lastModifiedTime, - Byte isPublished + Byte isPublic ) { this.name = name; this.description = description; @@ -54,7 +54,7 @@ public Workflow( this.content = content; this.creationTime = creationTime; this.lastModifiedTime = lastModifiedTime; - this.isPublished = isPublished; + this.isPublic = isPublic; } @Override @@ -118,13 +118,13 @@ public void setLastModifiedTime(Timestamp lastModifiedTime) { } @Override - public Byte getIsPublished() { - return this.isPublished; + public Byte getIsPublic() { + return this.isPublic; } @Override - public void setIsPublished(Byte isPublished) { - this.isPublished = isPublished; + public void setIsPublic(Byte isPublic) { + this.isPublic = isPublic; } @Override @@ -137,7 +137,7 @@ public String toString() { sb.append(", ").append(content); sb.append(", ").append(creationTime); sb.append(", ").append(lastModifiedTime); - sb.append(", ").append(isPublished); + sb.append(", ").append(isPublic); sb.append(")"); return sb.toString(); @@ -155,7 +155,7 @@ public void from(IWorkflow from) { setContent(from.getContent()); setCreationTime(from.getCreationTime()); setLastModifiedTime(from.getLastModifiedTime()); - setIsPublished(from.getIsPublished()); + setIsPublic(from.getIsPublic()); } @Override diff --git a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/records/WorkflowRecord.java b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/records/WorkflowRecord.java index c2c62e9a7b9..218dffacf65 100644 --- a/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/records/WorkflowRecord.java +++ 
b/core/dao/src/main/scala/edu/uci/ics/texera/dao/jooq/generated/tables/records/WorkflowRecord.java @@ -23,7 +23,7 @@ @SuppressWarnings({ "all", "unchecked", "rawtypes" }) public class WorkflowRecord extends UpdatableRecordImpl implements Record7, IWorkflow { - private static final long serialVersionUID = 1544914961; + private static final long serialVersionUID = 1992208375; /** * Setter for texera_db.workflow.name. @@ -122,18 +122,18 @@ public Timestamp getLastModifiedTime() { } /** - * Setter for texera_db.workflow.is_published. + * Setter for texera_db.workflow.is_public. */ @Override - public void setIsPublished(Byte value) { + public void setIsPublic(Byte value) { set(6, value); } /** - * Getter for texera_db.workflow.is_published. + * Getter for texera_db.workflow.is_public. */ @Override - public Byte getIsPublished() { + public Byte getIsPublic() { return (Byte) get(6); } @@ -192,7 +192,7 @@ public Field field6() { @Override public Field field7() { - return Workflow.WORKFLOW.IS_PUBLISHED; + return Workflow.WORKFLOW.IS_PUBLIC; } @Override @@ -227,7 +227,7 @@ public Timestamp component6() { @Override public Byte component7() { - return getIsPublished(); + return getIsPublic(); } @Override @@ -262,7 +262,7 @@ public Timestamp value6() { @Override public Byte value7() { - return getIsPublished(); + return getIsPublic(); } @Override @@ -303,7 +303,7 @@ public WorkflowRecord value6(Timestamp value) { @Override public WorkflowRecord value7(Byte value) { - setIsPublished(value); + setIsPublic(value); return this; } @@ -331,7 +331,7 @@ public void from(IWorkflow from) { setContent(from.getContent()); setCreationTime(from.getCreationTime()); setLastModifiedTime(from.getLastModifiedTime()); - setIsPublished(from.getIsPublished()); + setIsPublic(from.getIsPublic()); } @Override @@ -354,7 +354,7 @@ public WorkflowRecord() { /** * Create a detached, initialised WorkflowRecord */ - public WorkflowRecord(String name, String description, UInteger wid, String content, 
Timestamp creationTime, Timestamp lastModifiedTime, Byte isPublished) { + public WorkflowRecord(String name, String description, UInteger wid, String content, Timestamp creationTime, Timestamp lastModifiedTime, Byte isPublic) { super(Workflow.WORKFLOW); set(0, name); @@ -363,6 +363,6 @@ public WorkflowRecord(String name, String description, UInteger wid, String cont set(3, content); set(4, creationTime); set(5, lastModifiedTime); - set(6, isPublished); + set(6, isPublic); } } diff --git a/core/scripts/sql/texera_ddl.sql b/core/scripts/sql/texera_ddl.sql index a83ea62bc17..01af5ea1623 100644 --- a/core/scripts/sql/texera_ddl.sql +++ b/core/scripts/sql/texera_ddl.sql @@ -243,4 +243,10 @@ CREATE TABLE IF NOT EXISTS workflow_view_count `view_count` INT UNSIGNED NOT NULL DEFAULT 0, PRIMARY KEY (`wid`), FOREIGN KEY (`wid`) REFERENCES `workflow` (`wid`) ON DELETE CASCADE - ) ENGINE = INNODB; \ No newline at end of file + ) ENGINE = INNODB; + +ALTER TABLE dataset +MODIFY COLUMN is_public BOOLEAN NOT NULL DEFAULT true; + +ALTER TABLE workflow +CHANGE COLUMN is_published is_public BOOLEAN NOT NULL DEFAULT false; \ No newline at end of file diff --git a/core/scripts/sql/update/18.sql b/core/scripts/sql/update/18.sql new file mode 100644 index 00000000000..4e3931fe8a7 --- /dev/null +++ b/core/scripts/sql/update/18.sql @@ -0,0 +1,7 @@ +USE texera_db; + +ALTER TABLE dataset +MODIFY COLUMN is_public BOOLEAN NOT NULL DEFAULT true; + +ALTER TABLE workflow +CHANGE COLUMN is_published is_public BOOLEAN NOT NULL DEFAULT false; \ No newline at end of file From 55240994b66e3d43a3250d69f970635885a655b2 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Wed, 18 Dec 2024 16:59:27 -0800 Subject: [PATCH 10/47] Move output mode on port (#3169) This PR unifies the design of the output mode and makes it a property of an output port. Previously we had two modes: 1. SET_SNAPSHOT (return a snapshot of a table) 2.
SET_DELTA (return a delta of tuples) and various chart types (e.g., HTML, bar chart, line chart); however, only the HTML type is used after switching to pyplot. Additionally, the output mode and chart type were associated with a logical operator and passed along to the downstream sink operator; this design does not support operators with multiple output ports. ### New design 1. Move OutputMode onto an output port's property. 2. Unify to three modes: a. SET_SNAPSHOT (return a snapshot of a table) b. SET_DELTA (return a delta of tuples) c. SINGLE_SNAPSHOT (only used for visualizations to return an HTML) SINGLE_SNAPSHOT is needed for now because we need a way to differentiate an HTML output from a normal data table output: MongoDB storage is limited to 16 MB per document, and HTML outputs are usually larger than 16 MB. After we remove this storage limitation, we will remove SINGLE_SNAPSHOT and fall back to SET_SNAPSHOT. --- .../architecture/worker/WorkerResult.scala | 9 --- .../web/service/ExecutionResultService.scala | 60 +++++++------------ .../workflow/SinkInjectionTransformer.scala | 12 +--- .../texera/workflow/WorkflowCompiler.scala | 4 +- .../execute-workflow/mock-result-data.ts | 9 --- .../util/SinkInjectionTransformer.scala | 12 +--- .../protobuf/edu/uci/ics/amber/workflow.proto | 13 ++++ .../filter/SpecializedFilterOpDesc.java | 2 +- .../operator/sink/IncrementalOutputMode.java | 8 --- .../sink/managed/ProgressiveSinkOpDesc.java | 30 +++------- .../sink/managed/ProgressiveSinkOpExec.scala | 9 +-- .../typecasting/TypeCastingOpDesc.java | 2 +- .../source/PythonUDFSourceOpDescV2.java | 2 +- .../operator/udf/r/RUDFSourceOpDesc.java | 2 +- .../visualization/DotPlot/DotPlotOpDesc.scala | 8 +-- .../IcicleChart/IcicleChartOpDesc.scala | 8 +-- .../ImageViz/ImageVisualizerOpDesc.scala | 8 +-- .../ScatterMatrixChartOpDesc.scala | 8 +-- .../visualization/VisualizationConstants.java | 15 ----- .../visualization/VisualizationOperator.java | 21 -------
.../barChart/BarChartOpDesc.scala | 8 +-- .../visualization/boxPlot/BoxPlotOpDesc.scala | 8 +-- .../bubbleChart/BubbleChartOpDesc.scala | 8 +-- .../CandlestickChartOpDesc.scala | 7 +-- .../ContinuousErrorBandsOpDesc.scala | 8 +-- .../contourPlot/ContourPlotOpDesc.scala | 8 +-- .../dumbbellPlot/DumbbellPlotOpDesc.scala | 8 +-- .../FigureFactoryTableOpDesc.scala | 8 +-- .../filledAreaPlot/FilledAreaPlotOpDesc.scala | 8 +-- .../funnelPlot/FunnelPlotOpDesc.scala | 8 +-- .../ganttChart/GanttChartOpDesc.scala | 9 +-- .../visualization/heatMap/HeatMapOpDesc.scala | 7 +-- .../hierarchychart/HierarchyChartOpDesc.scala | 11 ++-- .../histogram/HistogramChartOpDesc.scala | 7 +-- .../visualization/htmlviz/HtmlVizOpDesc.scala | 9 ++- .../lineChart/LineChartOpDesc.scala | 8 +-- .../pieChart/PieChartOpDesc.scala | 8 +-- .../quiverPlot/QuiverPlotOpDesc.scala | 7 +-- .../sankeyDiagram/SankeyDiagramOpDesc.scala | 9 +-- .../scatter3DChart/Scatter3dChartOpDesc.scala | 9 +-- .../scatterplot/ScatterplotOpDesc.scala | 8 +-- .../tablesChart/TablesPlotOpDesc.scala | 8 +-- .../ternaryPlot/TernaryPlotOpDesc.scala | 9 +-- .../visualization/urlviz/UrlVizOpDesc.scala | 9 ++- ...pDesc.scala => WaterfallChartOpDesc.scala} | 8 +-- .../wordCloud/WordCloudOpDesc.scala | 13 ++-- 46 files changed, 155 insertions(+), 305 deletions(-) delete mode 100644 core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerResult.scala delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/IncrementalOutputMode.java delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/VisualizationConstants.java delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/VisualizationOperator.java rename core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/{WaterfallOpDesc.scala => WaterfallChartOpDesc.scala} (90%) diff --git 
a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerResult.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerResult.scala deleted file mode 100644 index 055c986ff21..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerResult.scala +++ /dev/null @@ -1,9 +0,0 @@ -package edu.uci.ics.amber.engine.architecture.worker - -import edu.uci.ics.amber.core.tuple.Tuple -import edu.uci.ics.amber.operator.sink.IncrementalOutputMode - -case class WorkerResult( - outputMode: IncrementalOutputMode, - result: List[Tuple] -) diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala index 195a8dd5cd3..1c091600db7 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala @@ -19,13 +19,12 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregat KILLED, RUNNING } -import edu.uci.ics.amber.operator.sink.IncrementalOutputMode.{SET_DELTA, SET_SNAPSHOT} import edu.uci.ics.amber.engine.common.client.AmberClient import edu.uci.ics.amber.engine.common.executionruntimestate.ExecutionMetadataStore import edu.uci.ics.amber.engine.common.{AmberConfig, AmberRuntime} -import edu.uci.ics.amber.operator.sink.IncrementalOutputMode import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc import edu.uci.ics.amber.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.texera.web.SubscriptionManager import edu.uci.ics.texera.web.model.websocket.event.{ PaginatedResultEvent, @@ -33,7 +32,6 @@ import edu.uci.ics.texera.web.model.websocket.event.{ WebResultUpdateEvent } import edu.uci.ics.texera.web.model.websocket.request.ResultPaginationRequest -import 
edu.uci.ics.texera.web.service.ExecutionResultService.WebResultUpdate import edu.uci.ics.texera.web.storage.{ExecutionStateStore, WorkflowStateStore} import edu.uci.ics.texera.workflow.LogicalPlan @@ -43,28 +41,17 @@ import scala.concurrent.duration.DurationInt object ExecutionResultService { - val defaultPageSize: Int = 5 - - // convert Tuple from engine's format to JSON format - def webDataFromTuple( - mode: WebOutputMode, - table: List[Tuple], - chartType: Option[String] - ): WebDataUpdate = { - val tableInJson = table.map(t => t.asKeyValuePairJson()) - WebDataUpdate(mode, tableInJson, chartType) - } + private val defaultPageSize: Int = 5 /** * convert Tuple from engine's format to JSON format */ private def tuplesToWebData( mode: WebOutputMode, - table: List[Tuple], - chartType: Option[String] + table: List[Tuple] ): WebDataUpdate = { val tableInJson = table.map(t => t.asKeyValuePairJson()) - WebDataUpdate(mode, tableInJson, chartType) + WebDataUpdate(mode, tableInJson) } /** @@ -75,41 +62,40 @@ object ExecutionResultService { * * Produces the WebResultUpdate to send to frontend from a result update from the engine. 
*/ - def convertWebResultUpdate( + private def convertWebResultUpdate( sink: ProgressiveSinkOpDesc, oldTupleCount: Int, newTupleCount: Int ): WebResultUpdate = { val webOutputMode: WebOutputMode = { - (sink.getOutputMode, sink.getChartType) match { - // visualization sinks use its corresponding mode - case (SET_SNAPSHOT, Some(_)) => SetSnapshotMode() - case (SET_DELTA, Some(_)) => SetDeltaMode() - // Non-visualization sinks use pagination mode - case (_, None) => PaginationMode() + sink.getOutputMode match { + // currently, only table outputs are using these modes + case OutputMode.SET_DELTA => SetDeltaMode() + case OutputMode.SET_SNAPSHOT => PaginationMode() + + // currently, only visualizations are using single snapshot mode + case OutputMode.SINGLE_SNAPSHOT => SetSnapshotMode() } } val storage = ResultStorage.getOpResultStorage(sink.getContext.workflowId).get(sink.getUpstreamId.get) - val webUpdate = (webOutputMode, sink.getOutputMode) match { - case (PaginationMode(), SET_SNAPSHOT) => + val webUpdate = webOutputMode match { + case PaginationMode() => val numTuples = storage.getCount val maxPageIndex = - Math.ceil(numTuples / ExecutionResultService.defaultPageSize.toDouble).toInt + Math.ceil(numTuples / defaultPageSize.toDouble).toInt WebPaginationUpdate( PaginationMode(), newTupleCount, (1 to maxPageIndex).toList ) - case (SetSnapshotMode(), SET_SNAPSHOT) => - tuplesToWebData(webOutputMode, storage.get().toList, sink.getChartType) - case (SetDeltaMode(), SET_DELTA) => + case SetSnapshotMode() => + tuplesToWebData(webOutputMode, storage.get().toList) + case SetDeltaMode() => val deltaList = storage.getAfter(oldTupleCount).toList - tuplesToWebData(webOutputMode, deltaList, sink.getChartType) + tuplesToWebData(webOutputMode, deltaList) - // currently not supported mode combinations - // (PaginationMode, SET_DELTA) | (DataSnapshotMode, SET_DELTA) | (DataDeltaMode, SET_SNAPSHOT) case _ => throw new RuntimeException( "update mode combination not supported: " + 
(webOutputMode, sink.getOutputMode) @@ -152,8 +138,8 @@ object ExecutionResultService { dirtyPageIndices: List[Int] ) extends WebResultUpdate - case class WebDataUpdate(mode: WebOutputMode, table: List[ObjectNode], chartType: Option[String]) - extends WebResultUpdate + case class WebDataUpdate(mode: WebOutputMode, table: List[ObjectNode]) extends WebResultUpdate + } /** @@ -227,7 +213,7 @@ class ExecutionResultService( addSubscription( workflowStateStore.resultStore.registerDiffHandler((oldState, newState) => { - val buf = mutable.HashMap[String, WebResultUpdate]() + val buf = mutable.HashMap[String, ExecutionResultService.WebResultUpdate]() val allTableStats = mutable.Map[String, Map[String, Map[String, Any]]]() newState.resultInfo .filter(info => { @@ -332,7 +318,7 @@ class ExecutionResultService( .toInt val mode = sink.getOutputMode val changeDetector = - if (mode == IncrementalOutputMode.SET_SNAPSHOT) { + if (mode == OutputMode.SET_SNAPSHOT) { UUID.randomUUID.toString } else "" (id, OperatorResultMetadata(count, changeDetector)) diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala index 686017ea8d3..80f0c3e290f 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala @@ -2,7 +2,6 @@ package edu.uci.ics.texera.workflow import edu.uci.ics.amber.operator.sink.SinkOpDesc import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc -import edu.uci.ics.amber.operator.visualization.VisualizationOperator import edu.uci.ics.amber.virtualidentity.OperatorIdentity import edu.uci.ics.amber.workflow.PortIdentity @@ -61,14 +60,9 @@ object SinkInjectionTransformer { sinkOp.setUpstreamPort(edge.get.fromPortId.id) // set output mode for visualization operator - (upstream.get, sinkOp) match { - // match the 
combination of a visualization operator followed by a sink operator - case (viz: VisualizationOperator, sink: ProgressiveSinkOpDesc) => - sink.setOutputMode(viz.outputMode()) - sink.setChartType(viz.chartType()) - case _ => - //skip - } + val outputPort = + upstream.get.operatorInfo.outputPorts.find(port => port.id == edge.get.fromPortId).get + sinkOp.setOutputMode(outputPort.mode) } }) diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala index bc03eaa0f21..9dd0bdc14f7 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala @@ -8,8 +8,8 @@ import edu.uci.ics.amber.core.workflow.{PhysicalPlan, WorkflowContext} import edu.uci.ics.amber.engine.architecture.controller.Workflow import edu.uci.ics.amber.engine.common.Utils.objectMapper import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc -import edu.uci.ics.amber.operator.visualization.VisualizationConstants import edu.uci.ics.amber.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.workflow.OutputPort.OutputMode.SINGLE_SNAPSHOT import edu.uci.ics.amber.workflow.PhysicalLink import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError import edu.uci.ics.texera.web.model.websocket.request.LogicalPlanPojo @@ -202,7 +202,7 @@ class WorkflowCompiler( // due to the size limit of single document in mongoDB (16MB) // for sinks visualizing HTMLs which could possibly be large in size, we always use the memory storage. 
val storageType = { - if (sink.getChartType.contains(VisualizationConstants.HTML_VIZ)) OpResultStorage.MEMORY + if (sink.getOutputMode == SINGLE_SNAPSHOT) OpResultStorage.MEMORY else OpResultStorage.defaultStorageMode } if (!reuseStorageSet.contains(storageKey) || !storage.contains(storageKey)) { diff --git a/core/gui/src/app/workspace/service/execute-workflow/mock-result-data.ts b/core/gui/src/app/workspace/service/execute-workflow/mock-result-data.ts index 3eb4a692b33..bb263ceba41 100644 --- a/core/gui/src/app/workspace/service/execute-workflow/mock-result-data.ts +++ b/core/gui/src/app/workspace/service/execute-workflow/mock-result-data.ts @@ -1,6 +1,5 @@ import { WebDataUpdate, WebPaginationUpdate } from "../../types/execute-workflow.interface"; import { Point, OperatorPredicate } from "../../types/workflow-common.interface"; -import { PaginatedResultEvent } from "../../types/workflow-websocket.interface"; import { IndexableObject } from "ng-zorro-antd/core/types"; export const mockData: IndexableObject[] = [ @@ -44,7 +43,6 @@ export const mockData: IndexableObject[] = [ export const mockResultSnapshotUpdate: WebDataUpdate = { mode: { type: "SetSnapshotMode" }, - chartType: undefined, table: mockData, }; @@ -54,13 +52,6 @@ export const mockResultPaginationUpdate: WebPaginationUpdate = { totalNumTuples: mockData.length, }; -export const paginationResponse: PaginatedResultEvent = { - requestID: "requestID", - operatorID: "operator-sink", - pageIndex: 1, - table: mockData, -}; - export const mockResultOperator: OperatorPredicate = { operatorID: "operator-sink", operatorType: "ViewResults", diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/util/SinkInjectionTransformer.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/util/SinkInjectionTransformer.scala index 9d2c385a8a7..75d952590c0 100644 --- 
a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/util/SinkInjectionTransformer.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/util/SinkInjectionTransformer.scala @@ -3,7 +3,6 @@ package edu.uci.ics.amber.compiler.util import edu.uci.ics.amber.compiler.model.LogicalPlan import edu.uci.ics.amber.operator.sink.SinkOpDesc import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc -import edu.uci.ics.amber.operator.visualization.VisualizationOperator import edu.uci.ics.amber.virtualidentity.OperatorIdentity import edu.uci.ics.amber.workflow.PortIdentity @@ -62,14 +61,9 @@ object SinkInjectionTransformer { sinkOp.setUpstreamPort(edge.get.fromPortId.id) // set output mode for visualization operator - (upstream.get, sinkOp) match { - // match the combination of a visualization operator followed by a sink operator - case (viz: VisualizationOperator, sink: ProgressiveSinkOpDesc) => - sink.setOutputMode(viz.outputMode()) - sink.setChartType(viz.chartType()) - case _ => - //skip - } + val outputPort = + upstream.get.operatorInfo.outputPorts.find(port => port.id == edge.get.fromPortId).get + sinkOp.setOutputMode(outputPort.mode) } }) diff --git a/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflow.proto b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflow.proto index 5d279c4ad58..0ee4c68d36a 100644 --- a/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflow.proto +++ b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflow.proto @@ -23,10 +23,23 @@ message InputPort { repeated PortIdentity dependencies = 4; } + + message OutputPort { + enum OutputMode { + // outputs complete result set snapshot for each update + SET_SNAPSHOT = 0; + // outputs incremental result set delta for each update + SET_DELTA = 1; + // outputs a single snapshot for the entire execution, + // used explicitly to support visualization operators that may exceed the memory limit + // TODO: 
remove this mode after we have a better solution for output size limit + SINGLE_SNAPSHOT = 2; + } PortIdentity id = 1 [(scalapb.field).no_box = true]; string displayName = 2; bool blocking = 3; + OutputMode mode = 4; } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.java index 185c1c1777c..ea35f40dee0 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.java +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.java @@ -48,7 +48,7 @@ public OperatorInfo operatorInfo() { "Performs a filter operation", OperatorGroupConstants.CLEANING_GROUP(), asScala(singletonList(new InputPort(new PortIdentity(0, false), "", false, asScala(new ArrayList()).toSeq()))).toList(), - asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false))).toList(), + asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(), false, false, true, diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/IncrementalOutputMode.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/IncrementalOutputMode.java deleted file mode 100644 index a02c71abed4..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/IncrementalOutputMode.java +++ /dev/null @@ -1,8 +0,0 @@ -package edu.uci.ics.amber.operator.sink; - -public enum IncrementalOutputMode { - // sink outputs complete result set snapshot for each update - SET_SNAPSHOT, - // sink outputs incremental result set delta for each update - SET_DELTA -} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpDesc.java 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpDesc.java
index ef621a9db9e..1f63fb86ab4 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpDesc.java
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpDesc.java
@@ -9,7 +9,6 @@
 import edu.uci.ics.amber.core.workflow.SchemaPropagationFunc;
 import edu.uci.ics.amber.operator.metadata.OperatorGroupConstants;
 import edu.uci.ics.amber.operator.metadata.OperatorInfo;
-import edu.uci.ics.amber.operator.sink.IncrementalOutputMode;
 import edu.uci.ics.amber.operator.sink.ProgressiveUtils;
 import edu.uci.ics.amber.operator.sink.SinkOpDesc;
 import edu.uci.ics.amber.operator.util.OperatorDescriptorUtils;
@@ -27,7 +26,7 @@
 import java.util.ArrayList;
 import java.util.function.Function;
 
-import static edu.uci.ics.amber.operator.sink.IncrementalOutputMode.SET_SNAPSHOT;
+
 import static java.util.Collections.singletonList;
 import static scala.jdk.javaapi.CollectionConverters.asScala;
 
@@ -36,11 +35,7 @@ public class ProgressiveSinkOpDesc extends SinkOpDesc {
     // use SET_SNAPSHOT as the default output mode
     // this will be set internally by the workflow compiler
     @JsonIgnore
-    private IncrementalOutputMode outputMode = SET_SNAPSHOT;
-
-    // whether this sink corresponds to a visualization result, default is no
-    @JsonIgnore
-    private Option chartType = Option.empty();
+    private OutputPort.OutputMode outputMode = OutputPort.OutputMode$.MODULE$.fromValue(0);
 
     // corresponding upstream operator ID and output port, will be set by workflow compiler
@@ -73,7 +68,7 @@ public PhysicalOp getPhysicalOp(WorkflowIdentity workflowId, ExecutionIdentity e
         // SET_SNAPSHOT:
         Schema outputSchema;
-        if (this.outputMode.equals(SET_SNAPSHOT)) {
+        if (this.outputMode.equals(OutputPort.OutputMode$.MODULE$.fromValue(0))) {
             if (inputSchema.containsAttribute(ProgressiveUtils.insertRetractFlagAttr().getName())) {
                 // input is insert/retract delta: the flag column is removed in output
                 outputSchema = Schema.builder().add(inputSchema)
@@ -101,7 +96,7 @@ public OperatorInfo operatorInfo() {
                 "View the results",
                 OperatorGroupConstants.UTILITY_GROUP(),
                 asScala(singletonList(new InputPort(new PortIdentity(0, false), "", false, asScala(new ArrayList()).toSeq()))).toList(),
-                asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false))).toList(),
+                asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(),
                 false,
                 false,
                 false,
@@ -114,7 +109,7 @@ public Schema getOutputSchema(Schema[] schemas) {
         Schema inputSchema = schemas[0];
 
         // SET_SNAPSHOT:
-        if (this.outputMode.equals(SET_SNAPSHOT)) {
+        if (this.outputMode.equals(OutputPort.OutputMode$.MODULE$.fromValue(0))) {
             if (inputSchema.containsAttribute(ProgressiveUtils.insertRetractFlagAttr().getName())) {
                 // input is insert/retract delta: the flag column is removed in output
                 return Schema.builder().add(inputSchema)
@@ -130,26 +125,15 @@ public Schema getOutputSchema(Schema[] schemas) {
     }
 
     @JsonIgnore
-    public IncrementalOutputMode getOutputMode() {
+    public OutputPort.OutputMode getOutputMode() {
         return outputMode;
     }
 
     @JsonIgnore
-    public void setOutputMode(IncrementalOutputMode outputMode) {
+    public void setOutputMode(OutputPort.OutputMode outputMode) {
         this.outputMode = outputMode;
     }
 
-    @JsonIgnore
-    public Option getChartType() {
-        return this.chartType;
-    }
-
-    @JsonIgnore
-    public void setChartType(String chartType) {
-        this.chartType = Option.apply(chartType);
-    }
-
     @JsonIgnore
     public Option getUpstreamId() {
         return upstreamId;
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala
index a58fe0930f4..a9408013d67 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala
@@ -4,12 +4,13 @@ import edu.uci.ics.amber.core.executor.SinkOperatorExecutor
 import edu.uci.ics.amber.core.storage.model.BufferedItemWriter
 import edu.uci.ics.amber.core.storage.result.ResultStorage
 import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike}
-import edu.uci.ics.amber.operator.sink.{IncrementalOutputMode, ProgressiveUtils}
+import edu.uci.ics.amber.operator.sink.ProgressiveUtils
 import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.PortIdentity
 
 class ProgressiveSinkOpExec(
-    outputMode: IncrementalOutputMode,
+    outputMode: OutputMode,
     storageKey: String,
     workflowIdentity: WorkflowIdentity
 ) extends SinkOperatorExecutor {
@@ -25,8 +26,8 @@ class ProgressiveSinkOpExec(
       input: Int
   ): Unit = {
     outputMode match {
-      case IncrementalOutputMode.SET_SNAPSHOT => updateSetSnapshot(tuple)
-      case IncrementalOutputMode.SET_DELTA => writer.putOne(tuple)
+      case OutputMode.SET_SNAPSHOT | OutputMode.SINGLE_SNAPSHOT => updateSetSnapshot(tuple)
+      case OutputMode.SET_DELTA => writer.putOne(tuple)
     }
   }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.java
index f1a55ab05fa..87d902f01f9 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.java
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.java
@@ -75,7 +75,7 @@ public OperatorInfo operatorInfo() {
                 "Cast between types",
                 OperatorGroupConstants.CLEANING_GROUP(),
                 asScala(singletonList(new InputPort(new PortIdentity(0, false), "", false, asScala(new ArrayList()).toSeq()))).toList(),
-                asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false))).toList(),
+                asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(),
                 false,
                 false,
                 false,
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.java
index cb6a3b940ab..944f171c397 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.java
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.java
@@ -99,7 +99,7 @@ public OperatorInfo operatorInfo() {
                 "User-defined function operator in Python script",
                 OperatorGroupConstants.PYTHON_GROUP(),
                 asScala(new ArrayList()).toList(),
-                asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false))).toList(),
+                asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(),
                 false,
                 false,
                 true,
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.java
index 294da9a570d..2b5785d9468 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.java
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.java
@@ -108,7 +108,7 @@ public OperatorInfo operatorInfo() {
                 "User-defined function operator in R script",
                 OperatorGroupConstants.R_GROUP(),
                 asScala(new ArrayList()).toList(),
-                asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false))).toList(),
+                asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(),
                 false,
                 false,
                 false,
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala
index b90d449ef68..5cf03a51a6c 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala
@@ -4,12 +4,12 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
 
-class DotPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class DotPlotOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(value = "Count Attribute", required = true)
   @JsonSchemaTitle("Count Attribute")
@@ -27,7 +27,7 @@ class DotPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor
       "Visualize data using a dot plot",
       OperatorGroupConstants.VISUALIZATION_GROUP,
       inputPorts = List(InputPort()),
-      outputPorts = List(OutputPort())
+      outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
     )
 
   def createPlotlyFigure(): String = {
@@ -74,6 +74,4 @@ class DotPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor
     finalCode
   }
 
-  // make the chart type to html visualization so it can be recognized by both backend and frontend.
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala
index 24d20cfaf98..bad227b992e 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala
@@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
 import edu.uci.ics.amber.operator.visualization.hierarchychart.HierarchySection
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 
 // type constraint: value can only be numeric
@@ -20,7 +20,7 @@ import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
   }
 }
 """)
-class IcicleChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class IcicleChartOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(required = true)
   @JsonSchemaTitle("Hierarchy Path")
   @JsonPropertyDescription(
@@ -44,7 +44,7 @@ class IcicleChartOpDesc extends VisualizationOperator with PythonOperatorDescrip
     "Visualize hierarchical data from root to leaves",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   private def getIcicleAttributesInPython: String =
@@ -104,6 +104,4 @@ class IcicleChartOpDesc extends VisualizationOperator with PythonOperatorDescrip
     finalCode
   }
 
-  // make the chart type to html visualization so it can be recognized by both backend and frontend.
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala
index 3db6ca8469d..cbb10911101 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala
@@ -3,12 +3,12 @@ package edu.uci.ics.amber.operator.visualization.ImageViz
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 
-class ImageVisualizerOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class ImageVisualizerOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(required = true)
   @JsonSchemaTitle("image content column")
@@ -26,7 +26,7 @@ class ImageVisualizerOpDesc extends VisualizationOperator with PythonOperatorDes
     "visualize image content",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   def createBinaryData(): String = {
@@ -72,6 +72,4 @@ class ImageVisualizerOpDesc extends VisualizationOperator with PythonOperatorDes
     finalCode
   }
 
-  // make the chart type to html visualization so it can be recognized by both backend and frontend.
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala
index 6b0e6c433b6..8e928f877bb 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala
@@ -9,8 +9,8 @@ import edu.uci.ics.amber.operator.metadata.annotations.{
   AutofillAttributeName,
   AutofillAttributeNameList
 }
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 
 @JsonSchemaInject(json = """
 {
   "attributeTypeRules": {
@@ -20,7 +20,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
   }
 }
 """)
-class ScatterMatrixChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class ScatterMatrixChartOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(value = "Selected Attributes", required = true)
   @JsonSchemaTitle("Selected Attributes")
@@ -44,7 +44,7 @@ class ScatterMatrixChartOpDesc extends VisualizationOperator with PythonOperator
     "Visualize datasets in a Scatter Matrix",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   def createPlotlyFigure(): String = {
@@ -81,6 +81,4 @@ class ScatterMatrixChartOpDesc extends VisualizationOperator with PythonOperator
     finalcode
   }
 
-  // make the chart type to html visualization so it can be recognized by both backend and frontend.
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/VisualizationConstants.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/VisualizationConstants.java
deleted file mode 100644
index d0fc1baa9d7..00000000000
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/VisualizationConstants.java
+++ /dev/null
@@ -1,15 +0,0 @@
-package edu.uci.ics.amber.operator.visualization;
-
-public class VisualizationConstants {
-    public static final String BAR = "bar";
-
-    public static final String LINE = "line";
-    public static final String SPLINE = "spline";
-
-    public static final String PIE = "pie";
-    public static final String DONUT = "donut";
-    public static final String HTML_VIZ = "HTML visualizer";
-    public static final String SIMPLE_SCATTERPLOT = "scatter";
-    public static final String SPATIAL_SCATTERPLOT = "spatial scatterplot";
-    public static final String WORD_CLOUD = "word cloud";
-}
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/VisualizationOperator.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/VisualizationOperator.java
deleted file mode 100644
index fc4e5c1f808..00000000000
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/VisualizationOperator.java
+++ /dev/null
@@ -1,21 +0,0 @@
-package edu.uci.ics.amber.operator.visualization;
-
-
-import edu.uci.ics.amber.operator.LogicalOp;
-import edu.uci.ics.amber.operator.sink.IncrementalOutputMode;
-
-/**
- * Base class for visualization operators. Visualization Operators should precede ViewResult Operator.
- * Author: Mingji Han, Xiaozhen Liu
- */
-public abstract class VisualizationOperator extends LogicalOp {
-
-    public abstract String chartType();
-
-    // visualization operators have SET_SNAPSHOT incremental output mode by default
-    // an operator can override this option if it wishes to output in other incremental output mode
-    public IncrementalOutputMode outputMode() {
-        return IncrementalOutputMode.SET_SNAPSHOT;
-    }
-
-}
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala
index f9e57c77eb4..64e5d6f8a60 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala
@@ -6,7 +6,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 
 //type constraint: value can only be numeric
@@ -19,7 +19,7 @@ import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
   }
 }
 """)
-class BarChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class BarChartOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(value = "value", required = true)
   @JsonSchemaTitle("Value Column")
@@ -60,7 +60,7 @@ class BarChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor
     "Visualize data in a Bar Chart",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   def manipulateTable(): String = {
@@ -126,6 +126,4 @@ class BarChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor
     finalCode
   }
 
-  // make the chart type to html visualization so it can be recognized by both backend and frontend.
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala
index fc1b4368efc..7e48e5cd43e 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala
@@ -3,10 +3,10 @@ package edu.uci.ics.amber.operator.visualization.boxPlot
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 
 @JsonSchemaInject(json = """
 {
@@ -17,7 +17,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
   }
 }
 """)
-class BoxPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class BoxPlotOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(value = "value", required = true)
   @JsonSchemaTitle("Value Column")
@@ -47,7 +47,7 @@ class BoxPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor
     "Visualize data in a Box Plot. Boxplots are drawn as a box with a vertical line down the middle which is mean value, and has horizontal lines attached to each side (known as “whiskers”).",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   def manipulateTable(): String = {
@@ -111,6 +111,4 @@ class BoxPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor
     finalCode
   }
 
-  // make the chart type to html visualization so it can be recognized by both backend and frontend.
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala
index 7deb1ce83c8..58c1a916e0e 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala
@@ -4,10 +4,10 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
 
 /**
   * Visualization Operator to visualize results as a Bubble Chart
@@ -16,7 +16,7 @@ import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, Visuali
   */
 
 // type can be numerical only
-class BubbleChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class BubbleChartOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(value = "xValue", required = true)
   @JsonSchemaTitle("X-Column")
@@ -43,8 +43,6 @@ class BubbleChartOpDesc extends VisualizationOperator with PythonOperatorDescrip
   @JsonPropertyDescription("Picks data column to color bubbles with if color is enabled")
   @AutofillAttributeName
   var colorCategory: String = ""
-  override def chartType: String = VisualizationConstants.HTML_VIZ
-
   override def getOutputSchema(schemas: Array[Schema]): Schema = {
     Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
   }
@@ -55,7 +53,7 @@ class BubbleChartOpDesc extends VisualizationOperator with PythonOperatorDescrip
     "a 3D Scatter Plot; Bubbles are graphed using x and y labels, and their sizes determined by a z-value.",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   def manipulateTable(): String = {
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala
index bf5b5ed3b59..39085aa76e1 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala
@@ -6,10 +6,10 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 
-class CandlestickChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class CandlestickChartOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(value = "date", required = true)
   @JsonSchemaTitle("Date Column")
@@ -51,7 +51,7 @@ class CandlestickChartOpDesc extends VisualizationOperator with PythonOperatorDe
     "Visualize data in a Candlestick Chart",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   override def generatePythonCode(): String = {
@@ -84,5 +84,4 @@ class CandlestickChartOpDesc extends VisualizationOperator with PythonOperatorDe
       |""".stripMargin
   }
 
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala
index 64ab936a209..189d238ee3b 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala
@@ -5,12 +5,12 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 
 import java.util
 import scala.jdk.CollectionConverters.ListHasAsScala
 
-class ContinuousErrorBandsOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class ContinuousErrorBandsOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(value = "xLabel", required = false, defaultValue = "X Axis")
   @JsonSchemaTitle("X Label")
@@ -35,7 +35,7 @@ class ContinuousErrorBandsOpDesc extends VisualizationOperator with PythonOperat
     "Visualize error or uncertainty along a continuous line",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   def createPlotlyFigure(): String = {
@@ -127,6 +127,4 @@ class ContinuousErrorBandsOpDesc extends VisualizationOperator with PythonOperat
       |""".stripMargin
     finalCode
   }
-
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala
index 45893aa34df..86854f41685 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala
@@ -6,10 +6,10 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 
-class ContourPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class ContourPlotOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(value = "x", required = true)
   @JsonSchemaTitle("x")
@@ -56,7 +56,7 @@ class ContourPlotOpDesc extends VisualizationOperator with PythonOperatorDescrip
     "Displays terrain or gradient variations in a Contour Plot",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
  )
 
   override def generatePythonCode(): String = {
@@ -92,6 +92,4 @@ class ContourPlotOpDesc extends VisualizationOperator with PythonOperatorDescrip
       |        yield {'html-content': html}
       |""".stripMargin
   }
-
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala
index ba72502d675..340e9768bec 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala
@@ -3,10 +3,10 @@ package edu.uci.ics.amber.operator.visualization.dumbbellPlot
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
 
 import java.util
 import scala.jdk.CollectionConverters.CollectionHasAsScala
@@ -21,7 +21,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
   }
 }
 """)
-class DumbbellPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class DumbbellPlotOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(value = "categoryColumnName", required = true)
   @JsonSchemaTitle("Category Column Name")
@@ -69,7 +69,7 @@ class DumbbellPlotOpDesc extends VisualizationOperator with PythonOperatorDescri
     "Visualize data in a Dumbbell Plots. A dumbbell plots (also known as a lollipop chart) is typically used to compare two distinct values or time points for the same entity.",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   def createPlotlyDumbbellLineFigure(): String = {
@@ -163,6 +163,4 @@ class DumbbellPlotOpDesc extends VisualizationOperator with PythonOperatorDescri
       |
       |""".stripMargin
   }
-
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala
index def42f80ec7..862fe472b32 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala
@@ -5,9 +5,9 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 
-class FigureFactoryTableOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class FigureFactoryTableOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(required = false)
   @JsonSchemaTitle("Font Size")
@@ -100,12 +100,10 @@ class FigureFactoryTableOpDesc extends VisualizationOperator with PythonOperator
       "Visualize data in a figure factory table",
       OperatorGroupConstants.VISUALIZATION_GROUP,
       inputPorts = List(InputPort()),
-      outputPorts = List(OutputPort())
+      outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
     )
   }
 
-  override def chartType(): String = VisualizationConstants.HTML_VIZ
-
   override def getOutputSchema(schemas: Array[Schema]): Schema = {
     Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
   }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala
index b2dde3aeb26..bb85ae82741 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala
@@ -7,9 +7,9 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator}
+import edu.uci.ics.amber.workflow.OutputPort.OutputMode
 
-class FilledAreaPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor {
+class FilledAreaPlotOpDesc extends PythonOperatorDescriptor {
 
   @JsonProperty(required = true)
   @JsonSchemaTitle("X-axis Attribute")
@@ -56,7 +56,7 @@ class FilledAreaPlotOpDesc extends VisualizationOperator with PythonOperatorDesc
     "Visualize data in filled area plot",
     OperatorGroupConstants.VISUALIZATION_GROUP,
     inputPorts = List(InputPort()),
-    outputPorts = List(OutputPort())
+    outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
   def createPlotlyFigure(): String = {
@@ 
-143,6 +143,4 @@ class FilledAreaPlotOpDesc extends VisualizationOperator with PythonOperatorDesc finalCode } - // make the chart type to html visualization so it can be recognized by both backend and frontend. - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala index 2509811e994..91c75660045 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala @@ -6,7 +6,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} @JsonSchemaInject(json = """ { @@ -15,7 +15,7 @@ import edu.uci.ics.amber.workflow.{InputPort, OutputPort} } } """) -class FunnelPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class FunnelPlotOpDesc extends PythonOperatorDescriptor { @JsonProperty(required = true) @JsonSchemaTitle("X Column") @@ -45,7 +45,7 @@ class FunnelPlotOpDesc extends VisualizationOperator with PythonOperatorDescript "Visualize data in a Funnel Plot", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) private def createPlotlyFigure(): String = { @@ 
-95,6 +95,4 @@ class FunnelPlotOpDesc extends VisualizationOperator with PythonOperatorDescript finalcode } - - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala index 17a7e853b2f..d130a8db7cf 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala @@ -7,7 +7,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode @JsonSchemaInject(json = """ { @@ -21,7 +21,7 @@ import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, Visuali } } """) -class GanttChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class GanttChartOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "start", required = true) @JsonSchemaTitle("Start Datetime Column") @@ -63,7 +63,7 @@ class GanttChartOpDesc extends VisualizationOperator with PythonOperatorDescript "A Gantt chart is a type of bar chart that illustrates a project schedule. The chart lists the tasks to be performed on the vertical axis, and time intervals on the horizontal axis. 
The width of the horizontal bars in the graph shows the duration of each activity.", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) def manipulateTable(): String = { @@ -117,7 +117,4 @@ class GanttChartOpDesc extends VisualizationOperator with PythonOperatorDescript |""".stripMargin finalCode } - - // make the chart type to html visualization so it can be recognized by both backend and frontend. - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala index d6647c06188..8f5837affa3 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala @@ -6,9 +6,9 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} -class HeatMapOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class HeatMapOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "x", required = true) @JsonSchemaTitle("Value X Column") @@ -38,7 +38,7 @@ class HeatMapOpDesc extends VisualizationOperator with PythonOperatorDescriptor "Visualize data in a HeatMap Chart", 
OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) private def createHeatMap(): String = { @@ -83,5 +83,4 @@ class HeatMapOpDesc extends VisualizationOperator with PythonOperatorDescriptor finalcode } - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala index 4422c226ffc..33d61e637e8 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala @@ -3,10 +3,11 @@ package edu.uci.ics.amber.operator.visualization.hierarchychart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} + import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.PythonOperatorDescriptor // type constraint: value can only be numeric @JsonSchemaInject(json = """ @@ -18,7 +19,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor } } """) -class HierarchyChartOpDesc extends VisualizationOperator with 
PythonOperatorDescriptor { +class HierarchyChartOpDesc extends PythonOperatorDescriptor { @JsonProperty(required = true) @JsonSchemaTitle("Chart Type") @JsonPropertyDescription("treemap or sunburst") @@ -47,7 +48,7 @@ class HierarchyChartOpDesc extends VisualizationOperator with PythonOperatorDesc "Visualize data in hierarchy", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) private def getHierarchyAttributesInPython: String = @@ -108,6 +109,4 @@ class HierarchyChartOpDesc extends VisualizationOperator with PythonOperatorDesc finalCode } - // make the chart type to html visualization so it can be recognized by both backend and frontend. - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala index e0ca67aba7c..b1470a9e23e 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala @@ -3,12 +3,12 @@ package edu.uci.ics.amber.operator.visualization.histogram import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import 
edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} import edu.uci.ics.amber.operator.PythonOperatorDescriptor -class HistogramChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class HistogramChartOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "value", required = true) @JsonSchemaTitle("Value Column") @JsonPropertyDescription("Column for counting values.") @@ -44,7 +44,7 @@ class HistogramChartOpDesc extends VisualizationOperator with PythonOperatorDesc "Visualize data in a Histogram Chart", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) def createPlotlyFigure(): String = { @@ -98,5 +98,4 @@ class HistogramChartOpDesc extends VisualizationOperator with PythonOperatorDesc Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build() } - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala index bddd1f8545d..9ff91b96bdd 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala @@ -5,23 +5,22 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.executor.OpExecInitInfo import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import 
edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} /** * HTML Visualization operator to render any given HTML code * This is the description of the operator */ -class HtmlVizOpDesc extends VisualizationOperator { +class HtmlVizOpDesc extends LogicalOp { @JsonProperty(required = true) @JsonSchemaTitle("HTML content") @AutofillAttributeName var htmlContentAttrName: String = _ - override def chartType: String = VisualizationConstants.HTML_VIZ - override def getPhysicalOp( workflowId: WorkflowIdentity, executionId: ExecutionIdentity @@ -52,7 +51,7 @@ class HtmlVizOpDesc extends VisualizationOperator { "Render the result of HTML content", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) override def getOutputSchema(schemas: Array[Schema]): Schema = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala index 26ff459a064..742ede861cd 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala @@ -5,13 +5,13 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, 
OperatorInfo} -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} import java.util import scala.jdk.CollectionConverters.ListHasAsScala -class LineChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class LineChartOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "yLabel", required = false, defaultValue = "Y Axis") @JsonSchemaTitle("Y Label") @@ -36,7 +36,7 @@ class LineChartOpDesc extends VisualizationOperator with PythonOperatorDescripto "View the result in line chart", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) def createPlotlyFigure(): String = { @@ -103,6 +103,4 @@ class LineChartOpDesc extends VisualizationOperator with PythonOperatorDescripto finalCode } - // make the chart type to html visualization so it can be recognized by both backend and frontend. 
- override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala index e6e1154888f..133e6763781 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala @@ -4,10 +4,10 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} // type constraint: value can only be numeric @JsonSchemaInject(json = """ @@ -19,7 +19,7 @@ import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, Visuali } } """) -class PieChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class PieChartOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "value", required = true) @JsonSchemaTitle("Value Column") @@ -43,7 +43,7 @@ class PieChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor "Visualize data in a Pie Chart", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) def manipulateTable(): 
String = { @@ -101,6 +101,4 @@ class PieChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor finalcode } - // make the chart type to html visualization so it can be recognized by both backend and frontend. - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala index 44266b828ba..8cff995fa83 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala @@ -4,10 +4,10 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} @JsonSchemaInject(json = """ { @@ -18,7 +18,7 @@ import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, Visuali } } """) -class QuiverPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class QuiverPlotOpDesc extends PythonOperatorDescriptor { //property panel variable: 4 requires: {x,y,u,v}, all columns should only contain numerical data @@ -52,7 +52,7 @@ class QuiverPlotOpDesc extends VisualizationOperator with PythonOperatorDescript "Visualize vector 
data in a Quiver Plot", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) //data cleaning for missing value @@ -117,5 +117,4 @@ class QuiverPlotOpDesc extends VisualizationOperator with PythonOperatorDescript finalCode } - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala index 2253487e2f4..3728bb55309 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala @@ -6,10 +6,10 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} -class SankeyDiagramOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class SankeyDiagramOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "Source Attribute", required = true) @JsonSchemaTitle("Source Attribute") @@ -39,7 +39,7 @@ class SankeyDiagramOpDesc extends VisualizationOperator with PythonOperatorDescr "Visualize data using a Sankey diagram", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = 
List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) def createPlotlyFigure(): String = { @@ -104,7 +104,4 @@ class SankeyDiagramOpDesc extends VisualizationOperator with PythonOperatorDescr |""".stripMargin finalCode } - - // make the chart type to html visualization so it can be recognized by both backend and frontend. - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala index 78f9b540116..aaba3fff092 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} import edu.uci.ics.amber.operator.PythonOperatorDescriptor +import edu.uci.ics.amber.workflow.OutputPort.OutputMode @JsonSchemaInject(json = """ { "attributeTypeRules": { @@ -15,7 +15,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor } } """) -class Scatter3dChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class Scatter3dChartOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "x", required = true) @JsonSchemaTitle("X Column") @JsonPropertyDescription("Data column for the x-axis") @@ -44,7 +44,7 @@ class Scatter3dChartOpDesc extends VisualizationOperator with PythonOperatorDesc "Visualize data 
in a Scatter3D Plot", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) private def createPlotlyFigure(): String = { @@ -105,7 +105,4 @@ class Scatter3dChartOpDesc extends VisualizationOperator with PythonOperatorDesc |""".stripMargin finalcode } - - // make the chart type to html visualization so it can be recognized by both backend and frontend. - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala index 0d2d4ebf625..8f522388109 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala @@ -7,7 +7,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode @JsonSchemaInject( json = @@ -22,7 +22,7 @@ import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, Visuali " }" + "}" ) -class ScatterplotOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class ScatterplotOpDesc extends PythonOperatorDescriptor { @JsonProperty(required = true) @JsonSchemaTitle("X-Column") @@ -60,8 +60,6 @@ class ScatterplotOpDesc extends VisualizationOperator with PythonOperatorDescrip @AutofillAttributeName var 
hoverName: String = "" - override def chartType: String = VisualizationConstants.HTML_VIZ - override def getOutputSchema(schemas: Array[Schema]): Schema = { Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build() } @@ -72,7 +70,7 @@ class ScatterplotOpDesc extends VisualizationOperator with PythonOperatorDescrip "View the result in a scatterplot", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) def manipulateTable(): String = { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala index 7161be833bf..648d4355b85 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala @@ -4,16 +4,14 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} -class TablesPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class TablesPlotOpDesc extends PythonOperatorDescriptor { @JsonPropertyDescription("List of columns to include in the table chart") @JsonProperty(value = "add attribute", required = true) var includedColumns: List[TablesConfig] = List() - override def chartType(): String = 
VisualizationConstants.HTML_VIZ - private def getAttributes: String = includedColumns.map(_.attributeName).mkString("'", "','", "'") @@ -78,7 +76,7 @@ class TablesPlotOpDesc extends VisualizationOperator with PythonOperatorDescript "Visualize data in a table chart.", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala index e25336dcd21..9f8059843c4 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala @@ -6,7 +6,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} /** @@ -16,7 +16,7 @@ import edu.uci.ics.amber.workflow.{InputPort, OutputPort} * The points can optionally be color coded using a data field. 
*/ -class TernaryPlotOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class TernaryPlotOpDesc extends PythonOperatorDescriptor { // Add annotations for the first variable @JsonProperty(value = "firstVariable", required = true) @@ -47,9 +47,6 @@ class TernaryPlotOpDesc extends VisualizationOperator with PythonOperatorDescrip @JsonPropertyDescription("Specify the data field to color") @AutofillAttributeName var colorDataField: String = "" - // Register chart type as a visualization operator - override def chartType: String = VisualizationConstants.HTML_VIZ - // OperatorInfo instance describing ternary plot override def operatorInfo: OperatorInfo = OperatorInfo( @@ -57,7 +54,7 @@ class TernaryPlotOpDesc extends VisualizationOperator with PythonOperatorDescrip operatorDescription = "Points are graphed on a Ternary Plot using 3 specified data fields", operatorGroupName = OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) /** Returns the output schema set as html-content */ diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala index e4537d0345e..db9fa71c891 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala @@ -5,11 +5,12 @@ import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchema import edu.uci.ics.amber.core.executor.OpExecInitInfo import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.operator.LogicalOp import 
edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode /** * URL Visualization operator to render any content in given URL link @@ -24,15 +25,13 @@ import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, Visuali } } """) -class UrlVizOpDesc extends VisualizationOperator { +class UrlVizOpDesc extends LogicalOp { @JsonProperty(required = true) @JsonSchemaTitle("URL content") @AutofillAttributeName private val urlContentAttrName: String = "" - override def chartType: String = VisualizationConstants.HTML_VIZ - override def getPhysicalOp( workflowId: WorkflowIdentity, executionId: ExecutionIdentity @@ -63,7 +62,7 @@ class UrlVizOpDesc extends VisualizationOperator { "Render the content of URL", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) override def getOutputSchema(schemas: Array[Schema]): Schema = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala similarity index 90% rename from core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallOpDesc.scala rename to core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala index 0074a6d46d9..236078d6e9b 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala @@ -6,10 +6,10 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.operator.visualization.{VisualizationConstants, VisualizationOperator} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} -class WaterfallChartOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class WaterfallChartOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "xColumn", required = true) @JsonSchemaTitle("X Axis Values") @@ -33,7 +33,7 @@ class WaterfallChartOpDesc extends VisualizationOperator with PythonOperatorDesc "Visualize data as a waterfall chart", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) def createPlotlyFigure(): String = { @@ -83,6 +83,4 @@ class WaterfallChartOpDesc extends VisualizationOperator with PythonOperatorDesc finalCode } - // Specify the chart type as HTML visualization - override def chartType(): String = VisualizationConstants.HTML_VIZ } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala index 0d34cd39268..9659504c5fe 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala +++ 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala @@ -10,13 +10,10 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.operator.visualization.{ - ImageUtility, - VisualizationConstants, - VisualizationOperator -} +import edu.uci.ics.amber.operator.visualization.ImageUtility +import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.{InputPort, OutputPort} -class WordCloudOpDesc extends VisualizationOperator with PythonOperatorDescriptor { +class WordCloudOpDesc extends PythonOperatorDescriptor { @JsonProperty(required = true) @JsonSchemaTitle("Text column") @AutofillAttributeName @@ -37,7 +34,7 @@ class WordCloudOpDesc extends VisualizationOperator with PythonOperatorDescripto "Generate word cloud for result texts", OperatorGroupConstants.VISUALIZATION_GROUP, inputPorts = List(InputPort()), - outputPorts = List(OutputPort()) + outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT)) ) def manipulateTable(): String = { @@ -92,6 +89,4 @@ class WordCloudOpDesc extends VisualizationOperator with PythonOperatorDescripto print(finalCode) finalCode } - - override def chartType(): String = VisualizationConstants.HTML_VIZ } From f90362e0cc8ab3ce5106432937405aaf4b0ac1f5 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Fri, 20 Dec 2024 13:29:29 -0800 Subject: [PATCH 11/47] Remove sink desc (#3170) This PR refactors the handling of sink operators in Texera by removing the sink descriptor and introducing a streamlined approach to creating physical sink operators during compilation and scheduling. 
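In broad strokes, the descriptor-free creation path this PR introduces can be sketched as follows. This is a hypothetical, simplified illustration only: the case-class and method bodies here are invented for clarity, while the real entry point in the diff below is `SpecialPhysicalOpFactory.newSinkPhysicalOp(workflowId, executionId, storageKey, outputMode)`.

```scala
// Hypothetical sketch: a sink physical operator is now derived directly from
// the upstream port's OutputMode plus a storage key -- no logical sink
// descriptor (formerly ProgressiveSinkOpDesc) sits in between.
sealed trait OutputMode
case object SetSnapshot extends OutputMode
case object SingleSnapshot extends OutputMode

// Illustrative stand-in for the engine's physical operator type.
final case class PhysicalOp(id: String, storageKey: String, mode: OutputMode)

object SinkFactorySketch {
  // Mirrors the shape of SpecialPhysicalOpFactory.newSinkPhysicalOp:
  // everything the sink needs is known at physical-plan construction time.
  def newSinkPhysicalOp(storageKey: String, mode: OutputMode): PhysicalOp =
    PhysicalOp(id = s"sink_$storageKey", storageKey = storageKey, mode = mode)
}
```

Under this scheme the caller (compiler or scheduler) creates the sink and its backing storage in the same step, which is what makes the old `SinkInjectionTransformer` pass over the logical plan unnecessary.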
Additionally, it shifts the storage assignment logic from the logical layer to the physical layer. 1. **Sink Descriptor Removal:** Removed the sink descriptor; physical sink operators are no longer created through descriptors. In the future, we will remove physical sink operators as well. 2. **Sink Operator Creation:** - Introduced a temporary factory for creating physical sink operators without relying on a descriptor. - Physical sink operators are now considered part of the sub-plan of their upstream logical operator. For example, if the HashJoin logical operator requires a sink, its physical sub-plan includes the building physicalOp, the probing physicalOp, and the sink physicalOp. 3. **Storage Assignment Refactor:** - Merged the storage assignment logic into the physical layer, removing it from the logical layer. - When a physical sink operator is created (either during compilation or scheduling), its associated storage is also created at the same moment. --- .../scheduling/ScheduleGenerator.scala | 55 ++--- .../web/service/ExecutionResultService.scala | 127 +++++----- .../service/WorkflowExecutionService.scala | 2 +-
.../ics/amber/operator/sink/SinkOpDesc.scala | 5 - .../sink/managed/ProgressiveSinkOpDesc.java | 158 ------------- .../sink/managed/ProgressiveSinkOpExec.scala | 1 + 22 files changed, 329 insertions(+), 839 deletions(-) delete mode 100644 core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/util/SinkInjectionTransformer.scala create mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/SinkOpDesc.scala delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpDesc.java diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala index 800d8344fd3..121be2289b1 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala @@ -2,7 +2,6 @@ package edu.uci.ics.amber.engine.architecture.scheduling import edu.uci.ics.amber.core.executor.OpExecInitInfo import edu.uci.ics.amber.core.storage.result.{OpResultStorage, ResultStorage} -import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{ PhysicalOp, PhysicalPlan, @@ -14,9 +13,9 @@ import edu.uci.ics.amber.engine.architecture.scheduling.resourcePolicies.{ DefaultResourceAllocator, ExecutionClusterInfo } -import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc +import edu.uci.ics.amber.operator.SpecialPhysicalOpFactory import edu.uci.ics.amber.operator.source.cache.CacheSourceOpExec -import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, PhysicalOpIdentity, WorkflowIdentity} +import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, PhysicalOpIdentity} import 
edu.uci.ics.amber.workflow.{OutputPort, PhysicalLink} import org.jgrapht.graph.DirectedAcyclicGraph import org.jgrapht.traverse.TopologicalOrderIterator @@ -159,9 +158,7 @@ abstract class ScheduleGenerator( .removeLink(physicalLink) // create cache writer and link - val matWriterInputSchema = fromOp.outputPorts(fromPortId)._3.toOption.get - val matWriterPhysicalOp: PhysicalOp = - createMatWriter(physicalLink, Array(matWriterInputSchema)) + val matWriterPhysicalOp: PhysicalOp = createMatWriter(physicalLink) val sourceToWriterLink = PhysicalLink( fromOp.id, @@ -173,6 +170,21 @@ abstract class ScheduleGenerator( .addOperator(matWriterPhysicalOp) .addLink(sourceToWriterLink) + // expect exactly one input port and one output port + val schema = newPhysicalPlan + .getOperator(matWriterPhysicalOp.id) + .outputPorts(matWriterPhysicalOp.outputPorts.keys.head) + ._3 + .toOption + .get + ResultStorage + .getOpResultStorage(workflowContext.workflowId) + .create( + key = matWriterPhysicalOp.id.logicalOpId, + mode = OpResultStorage.defaultStorageMode, + schema = Some(schema) + ) + // create cache reader and link val matReaderPhysicalOp: PhysicalOp = createMatReader(matWriterPhysicalOp.id.logicalOpId, physicalLink) @@ -219,31 +231,16 @@ abstract class ScheduleGenerator( } - private def createMatWriter( - physicalLink: PhysicalLink, - inputSchema: Array[Schema] - ): PhysicalOp = { - val matWriter = new ProgressiveSinkOpDesc() - matWriter.setContext(workflowContext) - matWriter.setOperatorId(s"materialized_${getMatIdFromPhysicalLink(physicalLink)}") - // expect exactly one input port and one output port - val schema = matWriter.getOutputSchema(inputSchema) - ResultStorage - .getOpResultStorage(workflowContext.workflowId) - .create( - key = matWriter.operatorIdentifier, - mode = OpResultStorage.defaultStorageMode, - schema = Some(schema) - ) - matWriter.setUpstreamId( - matWriter.operatorIdentifier - ) - - matWriter.getPhysicalOp( + private def createMatWriter(physicalLink: 
PhysicalLink): PhysicalOp = { + val outputMode = + physicalPlan.getOperator(physicalLink.fromOpId).outputPorts(physicalLink.fromPortId)._1.mode + val storageKey = s"materialized_${getMatIdFromPhysicalLink(physicalLink)}" + SpecialPhysicalOpFactory.newSinkPhysicalOp( workflowContext.workflowId, - workflowContext.executionId + workflowContext.executionId, + storageKey, + outputMode ) - } private def getMatIdFromPhysicalLink(physicalLink: PhysicalLink) = diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala index 1c091600db7..a0b09ff3fd1 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala @@ -12,6 +12,7 @@ import edu.uci.ics.amber.core.storage.result.{ WorkflowResultStore } import edu.uci.ics.amber.core.tuple.Tuple +import edu.uci.ics.amber.core.workflow.{PhysicalOp, PhysicalPlan} import edu.uci.ics.amber.engine.architecture.controller.{ExecutionStateUpdate, FatalError} import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregatedState.{ COMPLETED, @@ -22,8 +23,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregat import edu.uci.ics.amber.engine.common.client.AmberClient import edu.uci.ics.amber.engine.common.executionruntimestate.ExecutionMetadataStore import edu.uci.ics.amber.engine.common.{AmberConfig, AmberRuntime} -import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc -import edu.uci.ics.amber.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, WorkflowIdentity} import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.texera.web.SubscriptionManager import edu.uci.ics.texera.web.model.websocket.event.{ @@ -33,7 +33,6 @@ import 
edu.uci.ics.texera.web.model.websocket.event.{ } import edu.uci.ics.texera.web.model.websocket.request.ResultPaginationRequest import edu.uci.ics.texera.web.storage.{ExecutionStateStore, WorkflowStateStore} -import edu.uci.ics.texera.workflow.LogicalPlan import java.util.UUID import scala.collection.mutable @@ -63,12 +62,23 @@ object ExecutionResultService { * Produces the WebResultUpdate to send to frontend from a result update from the engine. */ private def convertWebResultUpdate( - sink: ProgressiveSinkOpDesc, + workflowIdentity: WorkflowIdentity, + physicalOps: List[PhysicalOp], oldTupleCount: Int, newTupleCount: Int ): WebResultUpdate = { + val outputMode = physicalOps + .flatMap(op => op.outputPorts) + .filter({ + case (portId, (port, links, schema)) => !portId.internal + }) + .map({ + case (portId, (port, links, schema)) => port.mode + }) + .head + val webOutputMode: WebOutputMode = { - sink.getOutputMode match { + outputMode match { // currently, only table outputs are using these modes case OutputMode.SET_DELTA => SetDeltaMode() case OutputMode.SET_SNAPSHOT => PaginationMode() @@ -79,7 +89,7 @@ object ExecutionResultService { } val storage = - ResultStorage.getOpResultStorage(sink.getContext.workflowId).get(sink.getUpstreamId.get) + ResultStorage.getOpResultStorage(workflowIdentity).get(physicalOps.head.id.logicalOpId) val webUpdate = webOutputMode match { case PaginationMode() => val numTuples = storage.getCount @@ -98,7 +108,7 @@ object ExecutionResultService { case _ => throw new RuntimeException( - "update mode combination not supported: " + (webOutputMode, sink.getOutputMode) + "update mode combination not supported: " + (webOutputMode, outputMode) ) } webUpdate @@ -150,18 +160,16 @@ object ExecutionResultService { * - send result update event to the frontend */ class ExecutionResultService( + workflowIdentity: WorkflowIdentity, val workflowStateStore: WorkflowStateStore ) extends SubscriptionManager with LazyLogging { - - var sinkOperators: 
mutable.HashMap[OperatorIdentity, ProgressiveSinkOpDesc] = - mutable.HashMap[OperatorIdentity, ProgressiveSinkOpDesc]() private val resultPullingFrequency = AmberConfig.executionResultPollingInSecs private var resultUpdateCancellable: Cancellable = _ def attachToExecution( stateStore: ExecutionStateStore, - logicalPlan: LogicalPlan, + physicalPlan: PhysicalPlan, client: AmberClient ): Unit = { @@ -181,7 +189,7 @@ class ExecutionResultService( 2.seconds, resultPullingFrequency.seconds ) { - onResultUpdate() + onResultUpdate(physicalPlan) } } } else { @@ -197,7 +205,7 @@ class ExecutionResultService( logger.info("Workflow execution terminated. Stop update results.") if (resultUpdateCancellable.cancel() || resultUpdateCancellable.isCancelled) { // immediately perform final update - onResultUpdate() + onResultUpdate(physicalPlan) } } }) @@ -225,18 +233,15 @@ class ExecutionResultService( case (opId, info) => val oldInfo = oldState.resultInfo.getOrElse(opId, OperatorResultMetadata()) buf(opId.id) = ExecutionResultService.convertWebResultUpdate( - sinkOperators(opId), + workflowIdentity, + physicalPlan.getPhysicalOpsOfLogicalOp(opId), oldInfo.tupleCount, info.tupleCount ) - if ( - StorageConfig.resultStorageMode.toLowerCase == "mongodb" - && !opId.id.startsWith("sink") - ) { - val sinkOp = sinkOperators(opId) + if (StorageConfig.resultStorageMode.toLowerCase == "mongodb") { val opStorage = ResultStorage - .getOpResultStorage(sinkOp.getContext.workflowId) - .get(sinkOp.getUpstreamId.get) + .getOpResultStorage(workflowIdentity) + .get(physicalPlan.getPhysicalOpsOfLogicalOp(opId).head.id.logicalOpId) opStorage match { case mongoDocument: MongoDocument[Tuple] => val tableCatStats = mongoDocument.getCategoricalStats @@ -262,22 +267,11 @@ class ExecutionResultService( }) ) - // first clear all the results - sinkOperators.clear() + // clear all the result metadata workflowStateStore.resultStore.updateState { _ => WorkflowResultStore() // empty result store } - // For operators 
connected to a sink and sinks, - // create result service so that the results can be displayed. - logicalPlan.getTerminalOperatorIds.map(sink => { - logicalPlan.getOperator(sink) match { - case sinkOp: ProgressiveSinkOpDesc => - sinkOperators += ((sinkOp.getUpstreamId.get, sinkOp)) - sinkOperators += ((sink, sinkOp)) - case other => // skip other non-texera-managed sinks, if any - } - }) } def handleResultPagination(request: ResultPaginationRequest): TexeraWebSocketEvent = { @@ -286,16 +280,12 @@ class ExecutionResultService( val opId = OperatorIdentity(request.operatorID) val paginationIterable = { - if (sinkOperators.contains(opId)) { - val sinkOp = sinkOperators(opId) - ResultStorage - .getOpResultStorage(sinkOp.getContext.workflowId) - .get(sinkOp.getUpstreamId.get) - .getRange(from, from + request.pageSize) - .to(Iterable) - } else { - Iterable.empty - } + ResultStorage + .getOpResultStorage(workflowIdentity) + .get(opId) + .getRange(from, from + request.pageSize) + .to(Iterable) + } val mappedResults = paginationIterable .map(tuple => tuple.asKeyValuePairJson()) @@ -306,23 +296,42 @@ class ExecutionResultService( PaginatedResultEvent.apply(request, mappedResults, attributes) } - private def onResultUpdate(): Unit = { + private def onResultUpdate(physicalPlan: PhysicalPlan): Unit = { workflowStateStore.resultStore.updateState { _ => - val newInfo: Map[OperatorIdentity, OperatorResultMetadata] = sinkOperators.map { - - case (id, sink) => - val count = ResultStorage - .getOpResultStorage(sink.getContext.workflowId) - .get(sink.getUpstreamId.get) - .getCount - .toInt - val mode = sink.getOutputMode - val changeDetector = - if (mode == OutputMode.SET_SNAPSHOT) { - UUID.randomUUID.toString - } else "" - (id, OperatorResultMetadata(count, changeDetector)) - }.toMap + val newInfo: Map[OperatorIdentity, OperatorResultMetadata] = { + ResultStorage + .getOpResultStorage(workflowIdentity) + .getAllKeys + .filter(!_.id.startsWith("materialized_")) + .map(storageKey => { + 
val count = ResultStorage + .getOpResultStorage(workflowIdentity) + .get(storageKey) + .getCount + .toInt + + val opId = storageKey + + // use the first output port's mode + val mode = physicalPlan + .getPhysicalOpsOfLogicalOp(opId) + .flatMap(physicalOp => physicalOp.outputPorts) + .filter({ + case (portId, (port, links, schema)) => + !portId.internal + }) + .map({ + case (portId, (port, links, schema)) => port.mode + }) + .head + val changeDetector = + if (mode == OutputMode.SET_SNAPSHOT) { + UUID.randomUUID.toString + } else "" + (opId, OperatorResultMetadata(count, changeDetector)) + }) + .toMap + } WorkflowResultStore(newInfo) } } diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowExecutionService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowExecutionService.scala index 26e5899c051..09015a0a00c 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowExecutionService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowExecutionService.scala @@ -114,7 +114,7 @@ class WorkflowExecutionService( executionConsoleService = new ExecutionConsoleService(client, executionStateStore, wsInput) logger.info("Starting the workflow execution.") - resultService.attachToExecution(executionStateStore, workflow.logicalPlan, client) + resultService.attachToExecution(executionStateStore, workflow.physicalPlan, client) executionStateStore.metadataStore.updateState(metadataStore => updateWorkflowState(READY, metadataStore) .withFatalErrors(Seq.empty) diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala index a4c3c0c0687..6f0b7868e54 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala @@ -77,7 +77,7 @@ class WorkflowService( val stateStore = new 
WorkflowStateStore() var executionService: BehaviorSubject[WorkflowExecutionService] = BehaviorSubject.create() - val resultService: ExecutionResultService = new ExecutionResultService(stateStore) + val resultService: ExecutionResultService = new ExecutionResultService(workflowId, stateStore) val exportService: ResultExportService = new ResultExportService(workflowId) val lifeCycleManager: WorkflowLifecycleManager = new WorkflowLifecycleManager( s"workflowId=$workflowId", diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala index 80f0c3e290f..8b137891791 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala @@ -1,72 +1 @@ -package edu.uci.ics.texera.workflow -import edu.uci.ics.amber.operator.sink.SinkOpDesc -import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PortIdentity - -object SinkInjectionTransformer { - - def transform(opsToViewResult: List[String], oldPlan: LogicalPlan): LogicalPlan = { - var logicalPlan = oldPlan - - // for any terminal operator without a sink, add a sink - val nonSinkTerminalOps = logicalPlan.getTerminalOperatorIds.filter(opId => - !logicalPlan.getOperator(opId).isInstanceOf[SinkOpDesc] - ) - // for any operators marked as view result without a sink, add a sink - val viewResultOps = opsToViewResult - .map(idString => OperatorIdentity(idString)) - .filter(opId => !logicalPlan.getDownstreamOps(opId).exists(op => op.isInstanceOf[SinkOpDesc])) - - val operatorsToAddSink = (nonSinkTerminalOps ++ viewResultOps).toSet - operatorsToAddSink.foreach(opId => { - val op = logicalPlan.getOperator(opId) - op.operatorInfo.outputPorts.foreach(outPort => { - val sink = new 
ProgressiveSinkOpDesc() - sink.setOperatorId("sink_" + opId.id) - logicalPlan = logicalPlan - .addOperator(sink) - .addLink( - op.operatorIdentifier, - outPort.id, - sink.operatorIdentifier, - toPortId = PortIdentity() - ) - }) - }) - - // check precondition: all the terminal operators should be sinks - assert( - logicalPlan.getTerminalOperatorIds.forall(o => - logicalPlan.getOperator(o).isInstanceOf[SinkOpDesc] - ) - ) - - // for each sink: - // set the corresponding upstream ID and port - // set output mode based on the visualization operator before it - logicalPlan.getTerminalOperatorIds.foreach(sinkOpId => { - val sinkOp = logicalPlan.getOperator(sinkOpId).asInstanceOf[ProgressiveSinkOpDesc] - val upstream = logicalPlan.getUpstreamOps(sinkOpId).headOption - val edge = logicalPlan.links.find(l => - l.fromOpId == upstream.map(_.operatorIdentifier).orNull - && l.toOpId == sinkOpId - ) - assert(upstream.nonEmpty) - if (upstream.nonEmpty && edge.nonEmpty) { - // set upstream ID and port - sinkOp.setUpstreamId(upstream.get.operatorIdentifier) - sinkOp.setUpstreamPort(edge.get.fromPortId.id) - - // set output mode for visualization operator - val outputPort = - upstream.get.operatorInfo.outputPorts.find(port => port.id == edge.get.fromPortId).get - sinkOp.setOutputMode(outputPort.mode) - } - }) - - logicalPlan - } - -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala index 9dd0bdc14f7..9144fa63d48 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala @@ -3,15 +3,13 @@ package edu.uci.ics.texera.workflow import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.storage.result.{OpResultStorage, ResultStorage} import edu.uci.ics.amber.core.tuple.Schema -import 
edu.uci.ics.amber.core.workflow.PhysicalOp.getExternalPortSchemas import edu.uci.ics.amber.core.workflow.{PhysicalPlan, WorkflowContext} import edu.uci.ics.amber.engine.architecture.controller.Workflow import edu.uci.ics.amber.engine.common.Utils.objectMapper -import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc +import edu.uci.ics.amber.operator.SpecialPhysicalOpFactory import edu.uci.ics.amber.virtualidentity.OperatorIdentity import edu.uci.ics.amber.workflow.OutputPort.OutputMode.SINGLE_SNAPSHOT -import edu.uci.ics.amber.workflow.PhysicalLink -import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError +import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity} import edu.uci.ics.texera.web.model.websocket.request.LogicalPlanPojo import edu.uci.ics.texera.web.service.ExecutionsMetadataPersistService @@ -19,78 +17,6 @@ import scala.collection.mutable.ArrayBuffer import scala.jdk.CollectionConverters.IteratorHasAsScala import scala.util.{Failure, Success, Try} -object WorkflowCompiler { - // util function for extracting the error causes - private def getStackTraceWithAllCauses(err: Throwable, topLevel: Boolean = true): String = { - val header = if (topLevel) { - "Stack trace for developers: \n\n" - } else { - "\n\nCaused by:\n" - } - val message = header + err.toString + "\n" + err.getStackTrace.mkString("\n") - if (err.getCause != null) { - message + getStackTraceWithAllCauses(err.getCause, topLevel = false) - } else { - message - } - } - - // util function for convert the error list to error map, and report the error in log - private def collectInputSchemasOfSinks( - physicalPlan: PhysicalPlan, - errorList: ArrayBuffer[(OperatorIdentity, Throwable)] // Mandatory error list - ): Map[OperatorIdentity, Array[Schema]] = { - physicalPlan.operators - .filter(op => op.isSinkOperator) - .map { physicalOp => - physicalOp.id.logicalOpId -> getExternalPortSchemas( - physicalOp, - fromInput = true, - Some(errorList) - ).flatten.toArray - } - .toMap 
- } - - // Only collects the input schemas for the sink operator - private def collectInputSchemaFromPhysicalPlanForSink( - physicalPlan: PhysicalPlan, - errorList: ArrayBuffer[(OperatorIdentity, Throwable)] // Mandatory error list - ): Map[OperatorIdentity, List[Option[Schema]]] = { - val physicalInputSchemas = - physicalPlan.operators.filter(op => op.isSinkOperator).map { physicalOp => - // Process inputPorts and capture Throwable values in the errorList - physicalOp.id -> physicalOp.inputPorts.values - .filterNot(_._1.id.internal) - .map { - case (port, _, schema) => - schema match { - case Left(err) => - // Save the Throwable into the errorList - errorList.append((physicalOp.id.logicalOpId, err)) - port.id -> None // Use None for this port - case Right(validSchema) => - port.id -> Some(validSchema) // Use the valid schema - } - } - .toList // Convert to a list for further processing - } - - // Group the physical input schemas by their logical operator ID and consolidate the schemas - physicalInputSchemas - .groupBy(_._1.logicalOpId) - .view - .mapValues(_.flatMap(_._2).toList.sortBy(_._1.id).map(_._2)) - .toMap - } -} - -case class WorkflowCompilationResult( - physicalPlan: Option[PhysicalPlan], // if physical plan is none, the compilation is failed - operatorIdToInputSchemas: Map[OperatorIdentity, List[Option[Schema]]], - operatorIdToError: Map[OperatorIdentity, WorkflowFatalError] -) - class WorkflowCompiler( context: WorkflowContext ) extends LazyLogging { @@ -98,9 +24,15 @@ class WorkflowCompiler( // function to expand logical plan to physical plan private def expandLogicalPlan( logicalPlan: LogicalPlan, + logicalOpsToViewResult: List[String], errorList: Option[ArrayBuffer[(OperatorIdentity, Throwable)]] ): PhysicalPlan = { + val terminalLogicalOps = logicalPlan.getTerminalOperatorIds + val toAddSink = (terminalLogicalOps ++ logicalOpsToViewResult).toSet var physicalPlan = PhysicalPlan(operators = Set.empty, links = Set.empty) + // create a JSON object that 
holds pointers to the workflow's results in Mongo + val resultsJSON = objectMapper.createObjectNode() + val sinksPointers = objectMapper.createArrayNode() logicalPlan.getTopologicalOpIds.asScala.foreach(logicalOpId => Try { @@ -135,8 +67,69 @@ class WorkflowCompiler( .foldLeft(physicalPlan) { (plan, link) => plan.addLink(link) } } }) + + // assign the sinks to toAddSink operators' external output ports + subPlan + .topologicalIterator() + .map(subPlan.getOperator) + .flatMap { physicalOp => + physicalOp.outputPorts.map(outputPort => (physicalOp, outputPort)) + } + .filter({ + case (physicalOp, (_, (outputPort, _, _))) => + toAddSink.contains(physicalOp.id.logicalOpId) && !outputPort.id.internal + }) + .foreach({ + case (physicalOp, (_, (outputPort, _, schema))) => + val storage = ResultStorage.getOpResultStorage(context.workflowId) + val storageKey = physicalOp.id.logicalOpId + + // due to the size limit of single document in mongoDB (16MB) + // for sinks visualizing HTMLs which could possibly be large in size, we always use the memory storage. 
+ val storageType = { + if (outputPort.mode == SINGLE_SNAPSHOT) OpResultStorage.MEMORY + else OpResultStorage.defaultStorageMode + } + if (!storage.contains(storageKey)) { + // get the schema for result storage in certain mode + val sinkStorageSchema: Option[Schema] = + if (storageType == OpResultStorage.MONGODB) { + // use the output schema on the first output port as the schema for storage + Some(schema.right.get) + } else { + None + } + storage.create( + s"${context.executionId}_", + storageKey, + storageType, + sinkStorageSchema + ) + // add the sink collection name to the JSON array of sinks + val storageNode = objectMapper.createObjectNode() + storageNode.put("storageType", storageType) + storageNode.put("storageKey", s"${context.executionId}_$storageKey") + sinksPointers.add(storageNode) + } + + val sinkPhysicalOp = SpecialPhysicalOpFactory.newSinkPhysicalOp( + context.workflowId, + context.executionId, + storageKey.id, + outputPort.mode + ) + val sinkLink = + PhysicalLink( + physicalOp.id, + outputPort.id, + sinkPhysicalOp.id, + PortIdentity(internal = true) + ) + physicalPlan = physicalPlan.addOperator(sinkPhysicalOp).addLink(sinkLink) + }) } match { case Success(_) => + case Failure(err) => errorList match { case Some(list) => list.append((logicalOpId, err)) @@ -144,6 +137,12 @@ class WorkflowCompiler( } } ) + + // update execution entry in MySQL to have pointers to the mongo collections + resultsJSON.set("results", sinksPointers) + ExecutionsMetadataPersistService.tryUpdateExistingExecution(context.executionId) { + _.setResult(resultsJSON.toString) + } physicalPlan } @@ -161,80 +160,17 @@ class WorkflowCompiler( logicalPlanPojo: LogicalPlanPojo ): Workflow = { // 1. convert the pojo to logical plan - var logicalPlan: LogicalPlan = LogicalPlan(logicalPlanPojo) + val logicalPlan: LogicalPlan = LogicalPlan(logicalPlanPojo) - // 2. 
Manipulate logical plan by: - // - inject sink - logicalPlan = SinkInjectionTransformer.transform( - logicalPlanPojo.opsToViewResult, - logicalPlan - ) - // - resolve the file name in each scan source operator + // 2. resolve the file name in each scan source operator logicalPlan.resolveScanSourceOpFileName(None) // 3. Propagate the schema to get the input & output schemas for each port of each operator logicalPlan.propagateWorkflowSchema(context, None) - // 4. assign the sink storage using logical plan and expand the logical plan to the physical plan, - assignSinkStorage(logicalPlan, context) - val physicalPlan = expandLogicalPlan(logicalPlan, None) + // 4. expand the logical plan to the physical plan, and assign storage + val physicalPlan = expandLogicalPlan(logicalPlan, logicalPlanPojo.opsToViewResult, None) Workflow(context, logicalPlan, physicalPlan) } - - /** - * Once standalone compiler is done, move this function to the execution service, and change the 1st parameter from LogicalPlan to PhysicalPlan - */ - @Deprecated - def assignSinkStorage( - logicalPlan: LogicalPlan, - context: WorkflowContext, - reuseStorageSet: Set[OperatorIdentity] = Set() - ): Unit = { - val storage = ResultStorage.getOpResultStorage(context.workflowId) - // create a JSON object that holds pointers to the workflow's results in Mongo - val resultsJSON = objectMapper.createObjectNode() - val sinksPointers = objectMapper.createArrayNode() - // assign storage to texera-managed sinks before generating exec config - logicalPlan.operators.foreach { - case o @ (sink: ProgressiveSinkOpDesc) => - val storageKey = sink.getUpstreamId.getOrElse(o.operatorIdentifier) - // due to the size limit of single document in mongoDB (16MB) - // for sinks visualizing HTMLs which could possibly be large in size, we always use the memory storage. 
- val storageType = { - if (sink.getOutputMode == SINGLE_SNAPSHOT) OpResultStorage.MEMORY - else OpResultStorage.defaultStorageMode - } - if (!reuseStorageSet.contains(storageKey) || !storage.contains(storageKey)) { - // get the schema for result storage in certain mode - val sinkStorageSchema: Option[Schema] = - if (storageType == OpResultStorage.MONGODB) { - // use the output schema on the first output port as the schema for storage - Some(o.outputPortToSchemaMapping.head._2) - } else { - None - } - storage.create( - s"${o.getContext.executionId}_", - storageKey, - storageType, - sinkStorageSchema - ) - // add the sink collection name to the JSON array of sinks - val storageNode = objectMapper.createObjectNode() - storageNode.put("storageType", storageType) - storageNode.put("storageKey", s"${o.getContext.executionId}_$storageKey") - sinksPointers.add(storageNode) - } - storage.get(storageKey) - - case _ => - } - // update execution entry in MySQL to have pointers to the mongo collections - resultsJSON.set("results", sinksPointers) - ExecutionsMetadataPersistService.tryUpdateExistingExecution(context.executionId) { - _.setResult(resultsJSON.toString) - } - } - } diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGeneratorSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGeneratorSpec.scala index be180b7cba3..d67524555e7 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGeneratorSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGeneratorSpec.scala @@ -11,17 +11,15 @@ import org.scalatest.flatspec.AnyFlatSpec class CostBasedScheduleGeneratorSpec extends AnyFlatSpec with MockFactory { - "CostBasedRegionPlanGenerator" should "finish bottom-up search using different pruning techniques with correct number of states explored in 
csv->->filter->join->sink workflow" in {
+  "CostBasedRegionPlanGenerator" should "finish bottom-up search using different pruning techniques with correct number of states explored in csv->filter->join workflow" in {
     val headerlessCsvOpDesc1 = TestOperators.headerlessSmallCsvScanOpDesc()
     val keywordOpDesc = TestOperators.keywordSearchOpDesc("column-1", "Asia")
     val joinOpDesc = TestOperators.joinOpDesc("column-1", "column-1")
-    val sink = TestOperators.sinkOpDesc()
     val workflow = buildWorkflow(
       List(
         headerlessCsvOpDesc1,
         keywordOpDesc,
-        joinOpDesc,
-        sink
+        joinOpDesc
       ),
       List(
         LogicalLink(
@@ -41,12 +39,6 @@ class CostBasedScheduleGeneratorSpec extends AnyFlatSpec with MockFactory {
           PortIdentity(),
           joinOpDesc.operatorIdentifier,
           PortIdentity(1)
-        ),
-        LogicalLink(
-          joinOpDesc.operatorIdentifier,
-          PortIdentity(),
-          sink.operatorIdentifier,
-          PortIdentity()
         )
       ),
       new WorkflowContext()
@@ -106,17 +98,15 @@ class CostBasedScheduleGeneratorSpec extends AnyFlatSpec with MockFactory {
 
   }
 
-  "CostBasedRegionPlanGenerator" should "finish top-down search using different pruning techniques with correct number of states explored in csv->->filter->join->sink workflow" in {
+  "CostBasedRegionPlanGenerator" should "finish top-down search using different pruning techniques with correct number of states explored in csv->filter->join workflow" in {
     val headerlessCsvOpDesc1 = TestOperators.headerlessSmallCsvScanOpDesc()
     val keywordOpDesc = TestOperators.keywordSearchOpDesc("column-1", "Asia")
     val joinOpDesc = TestOperators.joinOpDesc("column-1", "column-1")
-    val sink = TestOperators.sinkOpDesc()
     val workflow = buildWorkflow(
       List(
         headerlessCsvOpDesc1,
         keywordOpDesc,
-        joinOpDesc,
-        sink
+        joinOpDesc
       ),
       List(
         LogicalLink(
@@ -136,12 +126,6 @@ class CostBasedScheduleGeneratorSpec extends AnyFlatSpec with MockFactory {
           PortIdentity(),
           joinOpDesc.operatorIdentifier,
           PortIdentity(1)
-        ),
-        LogicalLink(
-          joinOpDesc.operatorIdentifier,
-          PortIdentity(),
-          sink.operatorIdentifier,
-          
PortIdentity() ) ), new WorkflowContext() diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGeneratorSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGeneratorSpec.scala index ea8c3c96a51..c28c3265a20 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGeneratorSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGeneratorSpec.scala @@ -13,24 +13,17 @@ import org.scalatest.flatspec.AnyFlatSpec class ExpansionGreedyScheduleGeneratorSpec extends AnyFlatSpec with MockFactory { - "RegionPlanGenerator" should "correctly find regions in headerlessCsv->keyword->sink workflow" in { + "RegionPlanGenerator" should "correctly find regions in headerlessCsv->keyword workflow" in { val headerlessCsvOpDesc = TestOperators.headerlessSmallCsvScanOpDesc() val keywordOpDesc = TestOperators.keywordSearchOpDesc("column-1", "Asia") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(headerlessCsvOpDesc, keywordOpDesc, sink), + List(headerlessCsvOpDesc, keywordOpDesc), List( LogicalLink( headerlessCsvOpDesc.operatorIdentifier, PortIdentity(0), keywordOpDesc.operatorIdentifier, PortIdentity(0) - ), - LogicalLink( - keywordOpDesc.operatorIdentifier, - PortIdentity(0), - sink.operatorIdentifier, - PortIdentity(0) ) ), new WorkflowContext() @@ -61,17 +54,15 @@ class ExpansionGreedyScheduleGeneratorSpec extends AnyFlatSpec with MockFactory } } - "RegionPlanGenerator" should "correctly find regions in csv->(csv->)->join->sink workflow" in { + "RegionPlanGenerator" should "correctly find regions in csv->(csv->)->join workflow" in { val headerlessCsvOpDesc1 = TestOperators.headerlessSmallCsvScanOpDesc() val headerlessCsvOpDesc2 = TestOperators.headerlessSmallCsvScanOpDesc() val joinOpDesc = TestOperators.joinOpDesc("column-1", 
"column-1") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( List( headerlessCsvOpDesc1, headerlessCsvOpDesc2, - joinOpDesc, - sink + joinOpDesc ), List( LogicalLink( @@ -85,12 +76,6 @@ class ExpansionGreedyScheduleGeneratorSpec extends AnyFlatSpec with MockFactory PortIdentity(), joinOpDesc.operatorIdentifier, PortIdentity(1) - ), - LogicalLink( - joinOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), new WorkflowContext() @@ -140,17 +125,15 @@ class ExpansionGreedyScheduleGeneratorSpec extends AnyFlatSpec with MockFactory } - "RegionPlanGenerator" should "correctly find regions in csv->->filter->join->sink workflow" in { + "RegionPlanGenerator" should "correctly find regions in csv->->filter->join workflow" in { val headerlessCsvOpDesc1 = TestOperators.headerlessSmallCsvScanOpDesc() val keywordOpDesc = TestOperators.keywordSearchOpDesc("column-1", "Asia") val joinOpDesc = TestOperators.joinOpDesc("column-1", "column-1") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( List( headerlessCsvOpDesc1, keywordOpDesc, - joinOpDesc, - sink + joinOpDesc ), List( LogicalLink( @@ -170,12 +153,6 @@ class ExpansionGreedyScheduleGeneratorSpec extends AnyFlatSpec with MockFactory PortIdentity(), joinOpDesc.operatorIdentifier, PortIdentity(1) - ), - LogicalLink( - joinOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), new WorkflowContext() @@ -206,19 +183,17 @@ class ExpansionGreedyScheduleGeneratorSpec extends AnyFlatSpec with MockFactory } } // - "RegionPlanGenerator" should "correctly find regions in buildcsv->probecsv->hashjoin->hashjoin->sink workflow" in { + "RegionPlanGenerator" should "correctly find regions in buildcsv->probecsv->hashjoin->hashjoin workflow" in { val buildCsv = TestOperators.headerlessSmallCsvScanOpDesc() val probeCsv = TestOperators.smallCsvScanOpDesc() val hashJoin1 = TestOperators.joinOpDesc("column-1", "Region") val 
hashJoin2 = TestOperators.joinOpDesc("column-2", "Country") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( List( buildCsv, probeCsv, hashJoin1, - hashJoin2, - sink + hashJoin2 ), List( LogicalLink( @@ -244,12 +219,6 @@ class ExpansionGreedyScheduleGeneratorSpec extends AnyFlatSpec with MockFactory PortIdentity(), hashJoin2.operatorIdentifier, PortIdentity(1) - ), - LogicalLink( - hashJoin2.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), new WorkflowContext() @@ -284,14 +253,12 @@ class ExpansionGreedyScheduleGeneratorSpec extends AnyFlatSpec with MockFactory val split = new SplitOpDesc() val training = new PythonUDFOpDescV2() val inference = new DualInputPortsPythonUDFOpDescV2() - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( List( csv, split, training, - inference, - sink + inference ), List( LogicalLink( @@ -317,12 +284,6 @@ class ExpansionGreedyScheduleGeneratorSpec extends AnyFlatSpec with MockFactory PortIdentity(1), inference.operatorIdentifier, PortIdentity(1) - ), - LogicalLink( - inference.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), new WorkflowContext() diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/BatchSizePropagationSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/BatchSizePropagationSpec.scala index 1b9a90d978c..a62d37ca284 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/BatchSizePropagationSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/BatchSizePropagationSpec.scala @@ -102,7 +102,7 @@ class BatchSizePropagationSpec } } - "Engine" should "propagate the correct batch size for headerlessCsv->sink workflow" in { + "Engine" should "propagate the correct batch size for headerlessCsv workflow" in { val expectedBatchSize = 1 val customWorkflowSettings = WorkflowSettings(dataTransferBatchSize = expectedBatchSize) @@ -110,18 +110,10 @@ class 
BatchSizePropagationSpec val context = new WorkflowContext(workflowSettings = customWorkflowSettings) val headerlessCsvOpDesc = TestOperators.headerlessSmallCsvScanOpDesc() - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(headerlessCsvOpDesc, sink), - List( - LogicalLink( - headerlessCsvOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() - ) - ), + List(headerlessCsvOpDesc), + List(), context ) @@ -131,7 +123,7 @@ class BatchSizePropagationSpec verifyBatchSizeInPartitioning(workflowScheduler, 1) } - "Engine" should "propagate the correct batch size for headerlessCsv->keyword->sink workflow" in { + "Engine" should "propagate the correct batch size for headerlessCsv->keyword workflow" in { val expectedBatchSize = 500 val customWorkflowSettings = WorkflowSettings(dataTransferBatchSize = expectedBatchSize) @@ -140,22 +132,15 @@ class BatchSizePropagationSpec val headerlessCsvOpDesc = TestOperators.headerlessSmallCsvScanOpDesc() val keywordOpDesc = TestOperators.keywordSearchOpDesc("column-1", "Asia") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(headerlessCsvOpDesc, keywordOpDesc, sink), + List(headerlessCsvOpDesc, keywordOpDesc), List( LogicalLink( headerlessCsvOpDesc.operatorIdentifier, PortIdentity(), keywordOpDesc.operatorIdentifier, PortIdentity() - ), - LogicalLink( - keywordOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), context @@ -167,7 +152,7 @@ class BatchSizePropagationSpec verifyBatchSizeInPartitioning(workflowScheduler, 500) } - "Engine" should "propagate the correct batch size for csv->keyword->count->sink workflow" in { + "Engine" should "propagate the correct batch size for csv->keyword->count workflow" in { val expectedBatchSize = 100 val customWorkflowSettings = WorkflowSettings(dataTransferBatchSize = expectedBatchSize) @@ -178,10 +163,9 @@ class BatchSizePropagationSpec val keywordOpDesc = 
TestOperators.keywordSearchOpDesc("Region", "Asia") val countOpDesc = TestOperators.aggregateAndGroupByDesc("Region", AggregationFunction.COUNT, List[String]()) - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(csvOpDesc, keywordOpDesc, countOpDesc, sink), + List(csvOpDesc, keywordOpDesc, countOpDesc), List( LogicalLink( csvOpDesc.operatorIdentifier, @@ -194,12 +178,6 @@ class BatchSizePropagationSpec PortIdentity(), countOpDesc.operatorIdentifier, PortIdentity() - ), - LogicalLink( - countOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), context @@ -211,7 +189,7 @@ class BatchSizePropagationSpec verifyBatchSizeInPartitioning(workflowScheduler, 100) } - "Engine" should "propagate the correct batch size for csv->keyword->averageAndGroupBy->sink workflow" in { + "Engine" should "propagate the correct batch size for csv->keyword->averageAndGroupBy workflow" in { val expectedBatchSize = 300 val customWorkflowSettings = WorkflowSettings(dataTransferBatchSize = expectedBatchSize) @@ -226,10 +204,8 @@ class BatchSizePropagationSpec AggregationFunction.AVERAGE, List[String]("Country") ) - val sink = TestOperators.sinkOpDesc() - val workflow = buildWorkflow( - List(csvOpDesc, keywordOpDesc, averageAndGroupByOpDesc, sink), + List(csvOpDesc, keywordOpDesc, averageAndGroupByOpDesc), List( LogicalLink( csvOpDesc.operatorIdentifier, @@ -242,12 +218,6 @@ class BatchSizePropagationSpec PortIdentity(), averageAndGroupByOpDesc.operatorIdentifier, PortIdentity() - ), - LogicalLink( - averageAndGroupByOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), context @@ -259,7 +229,7 @@ class BatchSizePropagationSpec verifyBatchSizeInPartitioning(workflowScheduler, 300) } - "Engine" should "propagate the correct batch size for csv->(csv->)->join->sink workflow" in { + "Engine" should "propagate the correct batch size for csv->(csv->)->join workflow" in { val expectedBatchSize = 1 
val customWorkflowSettings = WorkflowSettings(dataTransferBatchSize = expectedBatchSize) @@ -269,14 +239,12 @@ class BatchSizePropagationSpec val headerlessCsvOpDesc1 = TestOperators.headerlessSmallCsvScanOpDesc() val headerlessCsvOpDesc2 = TestOperators.headerlessSmallCsvScanOpDesc() val joinOpDesc = TestOperators.joinOpDesc("column-1", "column-1") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( List( headerlessCsvOpDesc1, headerlessCsvOpDesc2, - joinOpDesc, - sink + joinOpDesc ), List( LogicalLink( @@ -290,12 +258,6 @@ class BatchSizePropagationSpec PortIdentity(), joinOpDesc.operatorIdentifier, PortIdentity(1) - ), - LogicalLink( - joinOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), context diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala index 11a63cbf2c6..e56e0973e15 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala @@ -69,16 +69,8 @@ class DataProcessingSpec .registerCallback[ExecutionStateUpdate](evt => { if (evt.state == COMPLETED) { results = workflow.logicalPlan.getTerminalOperatorIds - .map(sinkOpId => - (sinkOpId, workflow.logicalPlan.getUpstreamOps(sinkOpId).head.operatorIdentifier) - ) - .filter { - case (_, upstreamOpId) => resultStorage.contains(upstreamOpId) - } - .map { - case (sinkOpId, upstreamOpId) => - (sinkOpId, resultStorage.get(upstreamOpId).get().toList) - } + .filter(terminalOpId => resultStorage.contains(terminalOpId)) + .map(terminalOpId => terminalOpId -> resultStorage.get(terminalOpId).get().toList) .toMap completion.setDone() } @@ -123,67 +115,43 @@ class DataProcessingSpec ("localhost", config.getPort.toString, database, table, username, password) } - "Engine" should "execute headerlessCsv->sink workflow normally" 
in { + "Engine" should "execute headerlessCsv workflow normally" in { val headerlessCsvOpDesc = TestOperators.headerlessSmallCsvScanOpDesc() - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(headerlessCsvOpDesc, sink), - List( - LogicalLink( - headerlessCsvOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() - ) - ), + List(headerlessCsvOpDesc), + List(), workflowContext ) - val results = executeWorkflow(workflow)(sink.operatorIdentifier) + val results = executeWorkflow(workflow)(headerlessCsvOpDesc.operatorIdentifier) assert(results.size == 100) } - "Engine" should "execute headerlessMultiLineDataCsv-->sink workflow normally" in { + "Engine" should "execute headerlessMultiLineDataCsv workflow normally" in { val headerlessCsvOpDesc = TestOperators.headerlessSmallMultiLineDataCsvScanOpDesc() - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(headerlessCsvOpDesc, sink), - List( - LogicalLink( - headerlessCsvOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() - ) - ), + List(headerlessCsvOpDesc), + List(), workflowContext ) - val results = executeWorkflow(workflow)(sink.operatorIdentifier) + val results = executeWorkflow(workflow)(headerlessCsvOpDesc.operatorIdentifier) assert(results.size == 100) } - "Engine" should "execute jsonl->sink workflow normally" in { + "Engine" should "execute jsonl workflow normally" in { val jsonlOp = TestOperators.smallJSONLScanOpDesc() - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(jsonlOp, sink), - List( - LogicalLink( - jsonlOp.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() - ) - ), + List(jsonlOp), + List(), workflowContext ) - val results = executeWorkflow(workflow)(sink.operatorIdentifier) + val results = executeWorkflow(workflow)(jsonlOp.operatorIdentifier) assert(results.size == 100) for (result <- results) { - val schema = 
result.asInstanceOf[Tuple].getSchema + val schema = result.getSchema assert(schema.getAttribute("id").getType == AttributeType.LONG) assert(schema.getAttribute("first_name").getType == AttributeType.STRING) assert(schema.getAttribute("flagged").getType == AttributeType.BOOLEAN) @@ -194,27 +162,19 @@ class DataProcessingSpec } - "Engine" should "execute mediumFlattenJsonl->sink workflow normally" in { + "Engine" should "execute mediumFlattenJsonl workflow normally" in { val jsonlOp = TestOperators.mediumFlattenJSONLScanOpDesc() - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(jsonlOp, sink), - List( - LogicalLink( - jsonlOp.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() - ) - ), + List(jsonlOp), + List(), workflowContext ) - val results = executeWorkflow(workflow)(sink.operatorIdentifier) + val results = executeWorkflow(workflow)(jsonlOp.operatorIdentifier) assert(results.size == 1000) for (result <- results) { - val schema = result.asInstanceOf[Tuple].getSchema + val schema = result.getSchema assert(schema.getAttribute("id").getType == AttributeType.LONG) assert(schema.getAttribute("first_name").getType == AttributeType.STRING) assert(schema.getAttribute("flagged").getType == AttributeType.BOOLEAN) @@ -225,24 +185,17 @@ class DataProcessingSpec } } - "Engine" should "execute headerlessCsv->keyword->sink workflow normally" in { + "Engine" should "execute headerlessCsv->keyword workflow normally" in { val headerlessCsvOpDesc = TestOperators.headerlessSmallCsvScanOpDesc() val keywordOpDesc = TestOperators.keywordSearchOpDesc("column-1", "Asia") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(headerlessCsvOpDesc, keywordOpDesc, sink), + List(headerlessCsvOpDesc, keywordOpDesc), List( LogicalLink( headerlessCsvOpDesc.operatorIdentifier, PortIdentity(), keywordOpDesc.operatorIdentifier, PortIdentity() - ), - LogicalLink( - keywordOpDesc.operatorIdentifier, - PortIdentity(), - 
sink.operatorIdentifier, - PortIdentity() ) ), workflowContext @@ -250,42 +203,27 @@ class DataProcessingSpec executeWorkflow(workflow) } - "Engine" should "execute csv->sink workflow normally" in { + "Engine" should "execute csv workflow normally" in { val csvOpDesc = TestOperators.smallCsvScanOpDesc() - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(csvOpDesc, sink), - List( - LogicalLink( - csvOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() - ) - ), + List(csvOpDesc), + List(), workflowContext ) executeWorkflow(workflow) } - "Engine" should "execute csv->keyword->sink workflow normally" in { + "Engine" should "execute csv->keyword workflow normally" in { val csvOpDesc = TestOperators.smallCsvScanOpDesc() val keywordOpDesc = TestOperators.keywordSearchOpDesc("Region", "Asia") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(csvOpDesc, keywordOpDesc, sink), + List(csvOpDesc, keywordOpDesc), List( LogicalLink( csvOpDesc.operatorIdentifier, PortIdentity(), keywordOpDesc.operatorIdentifier, PortIdentity() - ), - LogicalLink( - keywordOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), workflowContext @@ -298,9 +236,8 @@ class DataProcessingSpec val keywordOpDesc = TestOperators.keywordSearchOpDesc("Region", "Asia") val countOpDesc = TestOperators.aggregateAndGroupByDesc("Region", AggregationFunction.COUNT, List[String]()) - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(csvOpDesc, keywordOpDesc, countOpDesc, sink), + List(csvOpDesc, keywordOpDesc, countOpDesc), List( LogicalLink( csvOpDesc.operatorIdentifier, @@ -313,12 +250,6 @@ class DataProcessingSpec PortIdentity(), countOpDesc.operatorIdentifier, PortIdentity() - ), - LogicalLink( - countOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), workflowContext @@ -326,7 +257,7 @@ class DataProcessingSpec 
executeWorkflow(workflow) } - "Engine" should "execute csv->keyword->averageAndGroupBy->sink workflow normally" in { + "Engine" should "execute csv->keyword->averageAndGroupBy workflow normally" in { val csvOpDesc = TestOperators.smallCsvScanOpDesc() val keywordOpDesc = TestOperators.keywordSearchOpDesc("Region", "Asia") val averageAndGroupByOpDesc = @@ -335,9 +266,8 @@ class DataProcessingSpec AggregationFunction.AVERAGE, List[String]("Country") ) - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(csvOpDesc, keywordOpDesc, averageAndGroupByOpDesc, sink), + List(csvOpDesc, keywordOpDesc, averageAndGroupByOpDesc), List( LogicalLink( csvOpDesc.operatorIdentifier, @@ -350,12 +280,6 @@ class DataProcessingSpec PortIdentity(), averageAndGroupByOpDesc.operatorIdentifier, PortIdentity() - ), - LogicalLink( - averageAndGroupByOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), workflowContext @@ -363,17 +287,15 @@ class DataProcessingSpec executeWorkflow(workflow) } - "Engine" should "execute csv->(csv->)->join->sink workflow normally" in { + "Engine" should "execute csv->(csv->)->join workflow normally" in { val headerlessCsvOpDesc1 = TestOperators.headerlessSmallCsvScanOpDesc() val headerlessCsvOpDesc2 = TestOperators.headerlessSmallCsvScanOpDesc() val joinOpDesc = TestOperators.joinOpDesc("column-1", "column-1") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( List( headerlessCsvOpDesc1, headerlessCsvOpDesc2, - joinOpDesc, - sink + joinOpDesc ), List( LogicalLink( @@ -387,12 +309,6 @@ class DataProcessingSpec PortIdentity(), joinOpDesc.operatorIdentifier, PortIdentity(1) - ), - LogicalLink( - joinOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ), workflowContext @@ -401,20 +317,17 @@ class DataProcessingSpec } // TODO: use mock data to perform the test, remove dependency on the real AsterixDB - // "Engine" should "execute asterixdb->sink 
workflow normally" in { + // "Engine" should "execute asterixdb workflow normally" in { // // val asterixDBOp = TestOperators.asterixDBSourceOpDesc() - // val sink = TestOperators.sinkOpDesc() // val (id, workflow) = buildWorkflow( - // List(asterixDBOp, sink), - // List( - // OperatorLink(OperatorPort(asterixDBOp.operatorIdentifier, 0), OperatorPort(sink.operatorIdentifier, 0)) - // ) + // List(asterixDBOp), + // List() // ) // executeWorkflow(id, workflow) // } - "Engine" should "execute mysql->sink workflow normally" in { + "Engine" should "execute mysql workflow normally" in { val (host, port, database, table, username, password) = initializeInMemoryMySQLInstance() val inMemoryMsSQLSourceOpDesc = TestOperators.inMemoryMySQLSourceOpDesc( host, @@ -425,17 +338,9 @@ class DataProcessingSpec password ) - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(inMemoryMsSQLSourceOpDesc, sink), - List( - LogicalLink( - inMemoryMsSQLSourceOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() - ) - ), + List(inMemoryMsSQLSourceOpDesc), + List(), workflowContext ) executeWorkflow(workflow) diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/PauseSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/PauseSpec.scala index cc668fbc49c..014f3080b98 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/PauseSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/PauseSpec.scala @@ -73,44 +73,29 @@ class PauseSpec Await.result(completion) } - "Engine" should "be able to pause csv->sink workflow" in { + "Engine" should "be able to pause csv workflow" in { val csvOpDesc = TestOperators.mediumCsvScanOpDesc() - val sink = TestOperators.sinkOpDesc() - logger.info(s"csv-id ${csvOpDesc.operatorIdentifier}, sink-id ${sink.operatorIdentifier}") + logger.info(s"csv-id ${csvOpDesc.operatorIdentifier}") shouldPause( - List(csvOpDesc, sink), - List( - LogicalLink( - 
csvOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() - ) - ) + List(csvOpDesc), + List() ) } - "Engine" should "be able to pause csv->keyword->sink workflow" in { + "Engine" should "be able to pause csv->keyword workflow" in { val csvOpDesc = TestOperators.mediumCsvScanOpDesc() val keywordOpDesc = TestOperators.keywordSearchOpDesc("Region", "Asia") - val sink = TestOperators.sinkOpDesc() logger.info( - s"csv-id ${csvOpDesc.operatorIdentifier}, keyword-id ${keywordOpDesc.operatorIdentifier}, sink-id ${sink.operatorIdentifier}" + s"csv-id ${csvOpDesc.operatorIdentifier}, keyword-id ${keywordOpDesc.operatorIdentifier}" ) shouldPause( - List(csvOpDesc, keywordOpDesc, sink), + List(csvOpDesc, keywordOpDesc), List( LogicalLink( csvOpDesc.operatorIdentifier, PortIdentity(), keywordOpDesc.operatorIdentifier, PortIdentity() - ), - LogicalLink( - keywordOpDesc.operatorIdentifier, - PortIdentity(), - sink.operatorIdentifier, - PortIdentity() ) ) ) diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/CheckpointSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/CheckpointSpec.scala index d697a4946e6..6c694b989c1 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/CheckpointSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/CheckpointSpec.scala @@ -24,21 +24,14 @@ class CheckpointSpec extends AnyFlatSpecLike with BeforeAndAfterAll { val resultStorage = new OpResultStorage() val csvOpDesc = TestOperators.mediumCsvScanOpDesc() val keywordOpDesc = TestOperators.keywordSearchOpDesc("Region", "Asia") - val sink = TestOperators.sinkOpDesc() val workflow = buildWorkflow( - List(csvOpDesc, keywordOpDesc, sink), + List(csvOpDesc, keywordOpDesc), List( LogicalLink( csvOpDesc.operatorIdentifier, PortIdentity(), keywordOpDesc.operatorIdentifier, PortIdentity() - ), - LogicalLink( - keywordOpDesc.operatorIdentifier, - PortIdentity(), - 
sink.operatorIdentifier, - PortIdentity() ) ), new WorkflowContext() diff --git a/core/amber/src/test/scala/edu/uci/ics/texera/workflow/SchemaPropagationSpec.scala b/core/amber/src/test/scala/edu/uci/ics/texera/workflow/SchemaPropagationSpec.scala index c680177e28c..cb77ae266aa 100644 --- a/core/amber/src/test/scala/edu/uci/ics/texera/workflow/SchemaPropagationSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/texera/workflow/SchemaPropagationSpec.scala @@ -4,7 +4,6 @@ import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, WorkflowContext} import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.OperatorInfo -import edu.uci.ics.amber.operator.sink.SinkOpDesc import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, OperatorIdentity, WorkflowIdentity} import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} @@ -24,7 +23,7 @@ class SchemaPropagationSpec extends AnyFlatSpec with BeforeAndAfter { OperatorInfo("", "", "", List(InputPort()), List(OutputPort())) } - private class TempTestSinkOpDesc extends SinkOpDesc { + private class TempTestSinkOpDesc extends LogicalOp { override def getPhysicalOp( workflowId: WorkflowIdentity, executionId: ExecutionIdentity diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala index 6f32150311b..8a23c681ef7 100644 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.compiler.WorkflowCompiler.{ collectInputSchemaFromPhysicalPlan, convertErrorListToWorkflowFatalErrorMap } + import 
edu.uci.ics.amber.compiler.model.{LogicalPlan, LogicalPlanPojo} -import edu.uci.ics.amber.compiler.util.SinkInjectionTransformer import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{PhysicalPlan, WorkflowContext} import edu.uci.ics.amber.virtualidentity.OperatorIdentity @@ -63,7 +63,7 @@ object WorkflowCompiler { errorList: ArrayBuffer[(OperatorIdentity, Throwable)] // Mandatory error list ): Map[OperatorIdentity, List[Option[Schema]]] = { val physicalInputSchemas = - physicalPlan.operators.filter(op => !op.isSinkOperator).map { physicalOp => + physicalPlan.operators.map { physicalOp => // Process inputPorts and capture Throwable values in the errorList physicalOp.id -> physicalOp.inputPorts.values .filterNot(_._1.id.internal) @@ -164,19 +164,14 @@ class WorkflowCompiler( val errorList = new ArrayBuffer[(OperatorIdentity, Throwable)]() var opIdToInputSchema: Map[OperatorIdentity, List[Option[Schema]]] = Map() // 1. convert the pojo to logical plan - var logicalPlan: LogicalPlan = LogicalPlan(logicalPlanPojo) + val logicalPlan: LogicalPlan = LogicalPlan(logicalPlanPojo) - // 2. Manipulate logical plan by: - // - inject sink - logicalPlan = SinkInjectionTransformer.transform( - logicalPlanPojo.opsToViewResult, - logicalPlan - ) // - resolve the file name in each scan source operator logicalPlan.resolveScanSourceOpFileName(Some(errorList)) - // 3. expand the logical plan to the physical plan, + // 3. 
expand the logical plan to the physical plan val physicalPlan = expandLogicalPlan(logicalPlan, Some(errorList)) + if (errorList.isEmpty) { // no error during the expansion, then do: // - collect the input schema for each op diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/util/SinkInjectionTransformer.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/util/SinkInjectionTransformer.scala deleted file mode 100644 index 75d952590c0..00000000000 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/util/SinkInjectionTransformer.scala +++ /dev/null @@ -1,73 +0,0 @@ -package edu.uci.ics.amber.compiler.util - -import edu.uci.ics.amber.compiler.model.LogicalPlan -import edu.uci.ics.amber.operator.sink.SinkOpDesc -import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PortIdentity - -object SinkInjectionTransformer { - - def transform(opsToViewResult: List[String], oldPlan: LogicalPlan): LogicalPlan = { - var logicalPlan = oldPlan - - // for any terminal operator without a sink, add a sink - val nonSinkTerminalOps = logicalPlan.getTerminalOperatorIds.filter(opId => - !logicalPlan.getOperator(opId).isInstanceOf[SinkOpDesc] - ) - // for any operators marked as view result without a sink, add a sink - val viewResultOps = opsToViewResult - .map(idString => OperatorIdentity(idString)) - .filter(opId => !logicalPlan.getDownstreamOps(opId).exists(op => op.isInstanceOf[SinkOpDesc])) - - val operatorsToAddSink = (nonSinkTerminalOps ++ viewResultOps).toSet - operatorsToAddSink.foreach(opId => { - val op = logicalPlan.getOperator(opId) - op.operatorInfo.outputPorts.foreach(outPort => { - val sink = new ProgressiveSinkOpDesc() - sink.setOperatorId("sink_" + opId.id) - logicalPlan = logicalPlan - .addOperator(sink) - .addLink( - op.operatorIdentifier, - outPort.id, - 
sink.operatorIdentifier, - toPortId = PortIdentity() - ) - }) - }) - - // check precondition: all the terminal operators should be sinks - assert( - logicalPlan.getTerminalOperatorIds.forall(o => - logicalPlan.getOperator(o).isInstanceOf[SinkOpDesc] - ) - ) - - // for each sink: - // set the corresponding upstream ID and port - // set output mode based on the visualization operator before it - logicalPlan.getTerminalOperatorIds.foreach(sinkOpId => { - val sinkOp = logicalPlan.getOperator(sinkOpId).asInstanceOf[ProgressiveSinkOpDesc] - val upstream = logicalPlan.getUpstreamOps(sinkOpId).headOption - val edge = logicalPlan.links.find(l => - l.fromOpId == upstream.map(_.operatorIdentifier).orNull - && l.toOpId == sinkOpId - ) - assert(upstream.nonEmpty) - if (upstream.nonEmpty && edge.nonEmpty) { - // set upstream ID and port - sinkOp.setUpstreamId(upstream.get.operatorIdentifier) - sinkOp.setUpstreamPort(edge.get.fromPortId.id) - - // set output mode for visualization operator - val outputPort = - upstream.get.operatorInfo.outputPorts.find(port => port.id == edge.get.fromPortId).get - sinkOp.setOutputMode(outputPort.mode) - } - }) - - logicalPlan - } - -} diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala index 3485fb1285d..ab69d6f94d2 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala @@ -7,6 +7,7 @@ import edu.uci.ics.amber.core.tuple.{Schema, Tuple} import edu.uci.ics.amber.virtualidentity.OperatorIdentity import java.util.concurrent.ConcurrentHashMap +import scala.collection.convert.ImplicitConversions.`iterator asScala` object OpResultStorage { val defaultStorageMode: String = StorageConfig.resultStorageMode.toLowerCase @@ -55,6 +56,7 @@ class OpResultStorage 
extends Serializable with LazyLogging { mode: String, schema: Option[Schema] = None ): VirtualDocument[Tuple] = { + val storage: VirtualDocument[Tuple] = if (mode == "memory") { new MemoryDocument[Tuple](key.id) @@ -98,4 +100,8 @@ class OpResultStorage extends Serializable with LazyLogging { cache.clear() } + def getAllKeys: Set[OperatorIdentity] = { + cache.keySet().iterator().toSet + } + } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala index ac88f098cd3..925fc5314a0 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala @@ -37,7 +37,6 @@ import edu.uci.ics.amber.operator.randomksampling.RandomKSamplingOpDesc import edu.uci.ics.amber.operator.regex.RegexOpDesc import edu.uci.ics.amber.operator.reservoirsampling.ReservoirSamplingOpDesc import edu.uci.ics.amber.operator.sentiment.SentimentAnalysisOpDesc -import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc import edu.uci.ics.amber.operator.sklearn.{ SklearnAdaptiveBoostingOpDesc, SklearnBaggingOpDesc, @@ -158,7 +157,6 @@ trait StateTransferFunc value = classOf[TwitterSearchSourceOpDesc], name = "TwitterSearch" ), - new Type(value = classOf[ProgressiveSinkOpDesc], name = "SimpleSink"), new Type(value = classOf[CandlestickChartOpDesc], name = "CandlestickChart"), new Type(value = classOf[SplitOpDesc], name = "Split"), new Type(value = classOf[ContourPlotOpDesc], name = "ContourPlot"), diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala new file mode 100644 index 00000000000..0024993166f --- /dev/null +++ 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala @@ -0,0 +1,71 @@ +package edu.uci.ics.amber.operator + +import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.tuple.Schema +import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.operator.sink.ProgressiveUtils +import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpExec +import edu.uci.ics.amber.virtualidentity.{ + ExecutionIdentity, + OperatorIdentity, + PhysicalOpIdentity, + WorkflowIdentity +} +import edu.uci.ics.amber.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.workflow.OutputPort.OutputMode.{SET_DELTA, SET_SNAPSHOT, SINGLE_SNAPSHOT} +import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} + +object SpecialPhysicalOpFactory { + def newSinkPhysicalOp( + workflowIdentity: WorkflowIdentity, + executionIdentity: ExecutionIdentity, + storageKey: String, + outputMode: OutputMode + ): PhysicalOp = + PhysicalOp + .localPhysicalOp( + PhysicalOpIdentity(OperatorIdentity(storageKey), "sink"), + workflowIdentity, + executionIdentity, + OpExecInitInfo((idx, workers) => + new ProgressiveSinkOpExec( + outputMode, + storageKey, + workflowIdentity + ) + ) + ) + .withInputPorts(List(InputPort(PortIdentity(internal = true)))) + .withOutputPorts(List(OutputPort(PortIdentity(internal = true)))) + .withPropagateSchema( + SchemaPropagationFunc((inputSchemas: Map[PortIdentity, Schema]) => { + // Get the first schema from inputSchemas + val inputSchema = inputSchemas.values.head + + // Define outputSchema based on outputMode + val outputSchema = outputMode match { + case SET_SNAPSHOT | SINGLE_SNAPSHOT => + if (inputSchema.containsAttribute(ProgressiveUtils.insertRetractFlagAttr.getName)) { + // with insert/retract delta: remove the flag column + Schema + .builder() + .add(inputSchema) + .remove(ProgressiveUtils.insertRetractFlagAttr.getName) + .build() + } else { + // with 
insert-only delta: output schema is the same as input schema + inputSchema + } + + case SET_DELTA => + // output schema is the same as input schema + inputSchema + case _ => + throw new UnsupportedOperationException(s"Output mode $outputMode is not supported.") + } + + // Create a Scala immutable Map + Map(PortIdentity(internal = true) -> outputSchema) + }) + ) +} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/TestOperators.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/TestOperators.scala index 890eed672f5..bf03e272577 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/TestOperators.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/TestOperators.scala @@ -8,7 +8,6 @@ import edu.uci.ics.amber.operator.aggregate.{ } import edu.uci.ics.amber.operator.hashJoin.HashJoinOpDesc import edu.uci.ics.amber.operator.keywordSearch.KeywordSearchOpDesc -import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpDesc import edu.uci.ics.amber.operator.source.scan.csv.CSVScanSourceOpDesc import edu.uci.ics.amber.operator.source.scan.json.JSONLScanSourceOpDesc import edu.uci.ics.amber.operator.source.sql.asterixdb.AsterixDBSourceOpDesc @@ -144,10 +143,6 @@ object TestOperators { asterixDBOp } - def sinkOpDesc(): ProgressiveSinkOpDesc = { - new ProgressiveSinkOpDesc() - } - def pythonOpDesc(): PythonUDFOpDescV2 = { val udf = new PythonUDFOpDescV2() udf.workers = 1 diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/SinkOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/SinkOpDesc.scala deleted file mode 100644 index da190bcb7e4..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/SinkOpDesc.scala +++ /dev/null @@ -1,5 +0,0 @@ -package edu.uci.ics.amber.operator.sink - -import edu.uci.ics.amber.operator.LogicalOp - -abstract class SinkOpDesc extends LogicalOp diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpDesc.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpDesc.java deleted file mode 100644 index 1f63fb86ab4..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpDesc.java +++ /dev/null @@ -1,158 +0,0 @@ -package edu.uci.ics.amber.operator.sink.managed; - -import com.fasterxml.jackson.annotation.JsonIgnore; -import com.google.common.base.Preconditions; -import edu.uci.ics.amber.core.executor.OpExecInitInfo; -import edu.uci.ics.amber.core.executor.OperatorExecutor; -import edu.uci.ics.amber.core.tuple.Schema; -import edu.uci.ics.amber.core.workflow.PhysicalOp; -import edu.uci.ics.amber.core.workflow.SchemaPropagationFunc; -import edu.uci.ics.amber.operator.metadata.OperatorGroupConstants; -import edu.uci.ics.amber.operator.metadata.OperatorInfo; -import edu.uci.ics.amber.operator.sink.ProgressiveUtils; -import edu.uci.ics.amber.operator.sink.SinkOpDesc; -import edu.uci.ics.amber.operator.util.OperatorDescriptorUtils; -import edu.uci.ics.amber.virtualidentity.ExecutionIdentity; -import edu.uci.ics.amber.virtualidentity.OperatorIdentity; -import edu.uci.ics.amber.virtualidentity.WorkflowIdentity; -import edu.uci.ics.amber.workflow.InputPort; -import edu.uci.ics.amber.workflow.OutputPort; -import edu.uci.ics.amber.workflow.PortIdentity; -import scala.Option; -import scala.Tuple2; -import scala.collection.immutable.Map; - -import java.io.Serializable; -import java.util.ArrayList; -import java.util.function.Function; - - -import static java.util.Collections.singletonList; -import static scala.jdk.javaapi.CollectionConverters.asScala; - -public class ProgressiveSinkOpDesc extends SinkOpDesc { - - // use SET_SNAPSHOT as the default output mode - // this will be set internally by the workflow compiler - @JsonIgnore - private OutputPort.OutputMode outputMode = 
OutputPort.OutputMode$.MODULE$.fromValue(0); - - - // corresponding upstream operator ID and output port, will be set by workflow compiler - @JsonIgnore - private Option upstreamId = Option.empty(); - - @JsonIgnore - private Option upstreamPort = Option.empty(); - - @Override - public PhysicalOp getPhysicalOp(WorkflowIdentity workflowId, ExecutionIdentity executionId) { - - return PhysicalOp.localPhysicalOp( - workflowId, - executionId, - operatorIdentifier(), - OpExecInitInfo.apply( - (Function, OperatorExecutor> & java.io.Serializable) - worker -> new edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpExec(outputMode, this.getUpstreamId().get().id(), workflowId) - ) - ) - .withInputPorts(this.operatorInfo().inputPorts()) - .withOutputPorts(this.operatorInfo().outputPorts()) - .withPropagateSchema( - SchemaPropagationFunc.apply((Function, Map> & Serializable) inputSchemas -> { - // Initialize a Java HashMap - java.util.Map javaMap = new java.util.HashMap<>(); - - Schema inputSchema = inputSchemas.values().head(); - - // SET_SNAPSHOT: - Schema outputSchema; - if (this.outputMode.equals(OutputPort.OutputMode$.MODULE$.fromValue(0))) { - if (inputSchema.containsAttribute(ProgressiveUtils.insertRetractFlagAttr().getName())) { - // input is insert/retract delta: the flag column is removed in output - outputSchema = Schema.builder().add(inputSchema) - .remove(ProgressiveUtils.insertRetractFlagAttr().getName()).build(); - } else { - // input is insert-only delta: output schema is the same as input schema - outputSchema = inputSchema; - } - } else { - // SET_DELTA: output schema is always the same as input schema - outputSchema = inputSchema; - } - - javaMap.put(operatorInfo().outputPorts().head().id(), outputSchema); - // Convert the Java Map to a Scala immutable Map - return OperatorDescriptorUtils.toImmutableMap(javaMap); - }) - ); - } - - @Override - public OperatorInfo operatorInfo() { - return new OperatorInfo( - "View Results", - "View the results", - 
OperatorGroupConstants.UTILITY_GROUP(), - asScala(singletonList(new InputPort(new PortIdentity(0, false), "", false, asScala(new ArrayList()).toSeq()))).toList(), - asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(), - false, - false, - false, - false); - } - - @Override - public Schema getOutputSchema(Schema[] schemas) { - Preconditions.checkArgument(schemas.length == 1); - Schema inputSchema = schemas[0]; - - // SET_SNAPSHOT: - if (this.outputMode.equals(OutputPort.OutputMode$.MODULE$.fromValue(0))) { - if (inputSchema.containsAttribute(ProgressiveUtils.insertRetractFlagAttr().getName())) { - // input is insert/retract delta: the flag column is removed in output - return Schema.builder().add(inputSchema) - .remove(ProgressiveUtils.insertRetractFlagAttr().getName()).build(); - } else { - // input is insert-only delta: output schema is the same as input schema - return inputSchema; - } - } else { - // SET_DELTA: output schema is always the same as input schema - return inputSchema; - } - } - - @JsonIgnore - public OutputPort.OutputMode getOutputMode() { - return outputMode; - } - - @JsonIgnore - public void setOutputMode(OutputPort.OutputMode outputMode) { - this.outputMode = outputMode; - } - - @JsonIgnore - public Option getUpstreamId() { - return upstreamId; - } - - @JsonIgnore - public void setUpstreamId(OperatorIdentity upstreamId) { - this.upstreamId = Option.apply(upstreamId); - } - - @JsonIgnore - public Option getUpstreamPort() { - return upstreamPort; - } - - @JsonIgnore - public void setUpstreamPort(Integer upstreamPort) { - this.upstreamPort = Option.apply(upstreamPort); - } - - -} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala index a9408013d67..aaeababd685 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala @@ -28,6 +28,7 @@ class ProgressiveSinkOpExec( outputMode match { case OutputMode.SET_SNAPSHOT | OutputMode.SINGLE_SNAPSHOT => updateSetSnapshot(tuple) case OutputMode.SET_DELTA => writer.putOne(tuple) + case _ => throw new UnsupportedOperationException("Unsupported output mode") } } From 8a97cbb550cc8e14e6add8b2119e5c5202252a28 Mon Sep 17 00:00:00 2001 From: Jiadong Bai <43344272+bobbai00@users.noreply.github.com> Date: Sun, 22 Dec 2024 09:28:50 -0800 Subject: [PATCH 12/47] Fix the schema fetching during sink storage assignment (#3174) This PR fixes the issue that, when MongoDB is used as the result storage, the execution keeps failing. The root cause is that, when MongoDB is used as the result storage, the schema has to be extracted from the physical op, but the implementation extracted it from the physical ops in subPlan, which is incorrect because subPlan's physical ops do not have their output schemas propagated. The fix uses physicalPlan.getOperator instead of subPlan.getOperator.
In this way, the operator that is flowing to the downstream is from `physicalPlan`, not `subPlan` --- .../scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala index 9144fa63d48..fad33f628a0 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala @@ -71,7 +71,7 @@ class WorkflowCompiler( // assign the sinks to toAddSink operators' external output ports subPlan .topologicalIterator() - .map(subPlan.getOperator) + .map(physicalPlan.getOperator) .flatMap { physicalOp => physicalOp.outputPorts.map(outputPort => (physicalOp, outputPort)) } From 5267fecf0a3bc4f3ffdece6063387be521938f3d Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Fri, 27 Dec 2024 14:54:18 -0800 Subject: [PATCH 13/47] Fix view result (#3176) This PR fixes an issue where operators marked for viewing results did not display the expected results due to a mismatch caused by string-based comparison instead of using operator identities. By updating the matching logic to rely on operator identities, this change ensures accurate identification of marked operators and correct display of their results. 
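The type mismatch behind this fix can be reproduced in a few lines of standalone Scala. The sketch below uses a simplified stand-in for Texera's `OperatorIdentity` (illustration only, not the project's actual class):

```scala
// Simplified stand-in for Texera's OperatorIdentity; illustration only.
final case class OperatorIdentity(id: String)

object ViewResultMatchingSketch {
  def main(args: Array[String]): Unit = {
    val terminalOps: List[OperatorIdentity] = List(OperatorIdentity("sink-1"))
    val opsToViewResult: List[String] = List("filter-2")

    // Before the fix: the set mixes String and OperatorIdentity values,
    // so a membership test with an OperatorIdentity silently misses the
    // operators that were stored as raw id strings.
    val buggy = (terminalOps ++ opsToViewResult).toSet
    assert(!buggy.contains(OperatorIdentity("filter-2")))

    // After the fix: wrap each id string first, so the whole set (and every
    // later comparison) uses operator identities.
    val fixed: Set[OperatorIdentity] =
      (terminalOps ++ opsToViewResult.map(OperatorIdentity(_))).toSet
    assert(fixed.contains(OperatorIdentity("filter-2")))
  }
}
```

Mixing raw id strings into the set makes it type-check as a set of a common supertype, so the identity-based lookups never match; mapping the strings through `OperatorIdentity` first, as the one-line diff does, restores the match.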
--- .../scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala index fad33f628a0..da13af7e53a 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala @@ -28,7 +28,7 @@ class WorkflowCompiler( errorList: Option[ArrayBuffer[(OperatorIdentity, Throwable)]] ): PhysicalPlan = { val terminalLogicalOps = logicalPlan.getTerminalOperatorIds - val toAddSink = (terminalLogicalOps ++ logicalOpsToViewResult).toSet + val toAddSink = (terminalLogicalOps ++ logicalOpsToViewResult.map(OperatorIdentity(_))).toSet var physicalPlan = PhysicalPlan(operators = Set.empty, links = Set.empty) // create a JSON object that holds pointers to the workflow's results in Mongo val resultsJSON = objectMapper.createObjectNode() From d85ce7aa768c21faf1f0ca114c0cb2e6e7cc2848 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Sun, 29 Dec 2024 18:11:21 -0800 Subject: [PATCH 14/47] Remove logical schema propagation (#3177) Schema propagation is now handled in the physical plan to ensure that all ports, both external and internal, have an associated schema. As a result, schema propagation in the logical plan is no longer necessary. In addition, the workflow context was previously used to supply user information so that schema propagation could resolve file names; this is also no longer needed, since files are now resolved through an explicit call during compilation. The WorkflowContext therefore no longer needs to be set on the logical operator.
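As a rough illustration of what schema propagation in topological order looks like, here is a self-contained sketch over simplified stand-in types (Kahn's algorithm over an operator DAG; the names are hypothetical, not Texera's actual API):

```scala
import scala.collection.mutable

// Illustrative stand-ins; not Texera's actual Schema/PhysicalOp types.
final case class Schema(attrs: List[String])
final case class Op(id: String, derive: List[Schema] => Schema)
final case class Link(from: String, to: String)

object SchemaPropagationSketch {
  // Computes each op's output schema from its upstream ops' output schemas,
  // visiting ops in topological order (Kahn's algorithm).
  def propagate(ops: List[Op], links: List[Link]): Map[String, Schema] = {
    val inDeg = mutable.Map(ops.map(o => o.id -> 0): _*)
    links.foreach(l => inDeg(l.to) += 1)
    // Start from the ops with no upstream (the sources).
    val queue = mutable.Queue(ops.collect { case o if inDeg(o.id) == 0 => o.id }: _*)
    val byId = ops.map(o => o.id -> o).toMap
    val out = mutable.Map.empty[String, Schema]
    while (queue.nonEmpty) {
      val id = queue.dequeue()
      // All upstream schemas are ready because of the visiting order.
      val inputs = links.filter(_.to == id).map(l => out(l.from))
      out(id) = byId(id).derive(inputs)
      links.filter(_.from == id).foreach { l =>
        inDeg(l.to) -= 1
        if (inDeg(l.to) == 0) queue.enqueue(l.to)
      }
    }
    out.toMap
  }
}
```

In this sketch a source op's `derive` ignores its empty input list and returns the scan schema, while a pass-through sink returns its single input schema unchanged.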
--- .../ExecutionReconfigurationService.scala | 1 - .../uci/ics/texera/workflow/LogicalPlan.scala | 123 ----------- .../uci/ics/texera/workflow/LogicalPort.scala | 25 --- .../workflow/SinkInjectionTransformer.scala | 1 - .../texera/workflow/WorkflowCompiler.scala | 6 +- .../workflow/SchemaPropagationSpec.scala | 199 ------------------ .../ics/amber/compiler/WorkflowCompiler.scala | 3 +- .../amber/compiler/model/LogicalPlan.scala | 106 ---------- .../uci/ics/amber/operator/LogicalOp.scala | 49 +---- .../operator/hashJoin/HashJoinOpDesc.scala | 2 +- .../sentiment/SentimentAnalysisOpDesc.scala | 4 +- .../source/scan/ScanSourceOpDesc.scala | 5 - .../scan/csv/CSVScanSourceOpDescSpec.scala | 8 - 13 files changed, 11 insertions(+), 521 deletions(-) delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPort.scala delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala delete mode 100644 core/amber/src/test/scala/edu/uci/ics/texera/workflow/SchemaPropagationSpec.scala diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionReconfigurationService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionReconfigurationService.scala index 393fd4b7433..9fa342d470a 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionReconfigurationService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionReconfigurationService.scala @@ -55,7 +55,6 @@ class ExecutionReconfigurationService( // they are not actually performed until the workflow is resumed def modifyOperatorLogic(modifyLogicRequest: ModifyLogicRequest): TexeraWebSocketEvent = { val newOp = modifyLogicRequest.operator - newOp.setContext(workflow.context) val opId = newOp.operatorIdentifier val currentOp = workflow.logicalPlan.getOperator(opId) val reconfiguredPhysicalOp = diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala 
b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala index 27d2e504ddf..46bbb441cbe 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala @@ -2,19 +2,14 @@ package edu.uci.ics.texera.workflow import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.storage.FileResolver -import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.WorkflowContext import edu.uci.ics.amber.operator.LogicalOp -import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PortIdentity import edu.uci.ics.texera.web.model.websocket.request.LogicalPlanPojo import org.jgrapht.graph.DirectedAcyclicGraph import org.jgrapht.util.SupplierUtil import java.util -import scala.collection.mutable import scala.collection.mutable.ArrayBuffer import scala.jdk.CollectionConverters.SetHasAsScala import scala.util.{Failure, Success, Try} @@ -48,7 +43,6 @@ object LogicalPlan { ): LogicalPlan = { LogicalPlan(pojo.operators, pojo.links) } - } case class LogicalPlan( @@ -64,22 +58,13 @@ case class LogicalPlan( def getTopologicalOpIds: util.Iterator[OperatorIdentity] = jgraphtDag.iterator() - def getOperator(opId: String): LogicalOp = operatorMap(OperatorIdentity(opId)) - def getOperator(opId: OperatorIdentity): LogicalOp = operatorMap(opId) - def getSourceOperatorIds: List[OperatorIdentity] = - operatorMap.keys.filter(op => jgraphtDag.inDegreeOf(op) == 0).toList - def getTerminalOperatorIds: List[OperatorIdentity] = operatorMap.keys .filter(op => jgraphtDag.outDegreeOf(op) == 0) .toList - def getAncestorOpIds(opId: OperatorIdentity): Set[OperatorIdentity] = { - jgraphtDag.getAncestors(opId).asScala.toSet - } - def getUpstreamOps(opId: OperatorIdentity): List[LogicalOp] = { jgraphtDag 
.incomingEdgesOf(opId) @@ -88,64 +73,10 @@ case class LogicalPlan( .toList } - def addOperator(op: LogicalOp): LogicalPlan = { - // TODO: fix schema for the new operator - this.copy(operators :+ op, links) - } - - def removeOperator(opId: OperatorIdentity): LogicalPlan = { - this.copy( - operators.filter(o => o.operatorIdentifier != opId), - links.filter(l => l.fromOpId != opId && l.toOpId != opId) - ) - } - - def addLink( - fromOpId: OperatorIdentity, - fromPortId: PortIdentity, - toOpId: OperatorIdentity, - toPortId: PortIdentity - ): LogicalPlan = { - val newLink = LogicalLink( - fromOpId, - fromPortId, - toOpId, - toPortId - ) - val newLinks = links :+ newLink - this.copy(operators, newLinks) - } - - def removeLink(linkToRemove: LogicalLink): LogicalPlan = { - this.copy(operators, links.filter(l => l != linkToRemove)) - } - - def getDownstreamOps(opId: OperatorIdentity): List[LogicalOp] = { - val downstream = new mutable.ArrayBuffer[LogicalOp] - jgraphtDag - .outgoingEdgesOf(opId) - .forEach(e => downstream += operatorMap(e.toOpId)) - downstream.toList - } - - def getDownstreamLinks(opId: OperatorIdentity): List[LogicalLink] = { - links.filter(l => l.fromOpId == opId) - } - def getUpstreamLinks(opId: OperatorIdentity): List[LogicalLink] = { links.filter(l => l.toOpId == opId) } - def getInputSchemaMap: Map[OperatorIdentity, List[Option[Schema]]] = { - operators - .map(operator => { - operator.operatorIdentifier -> operator.operatorInfo.inputPorts.map(inputPort => - operator.inputPortToSchemaMapping.get(inputPort.id) - ) - }) - .toMap - } - /** * Resolve all user-given filename for the scan source operators to URIs, and call op.setFileUri to set the URi * @@ -180,58 +111,4 @@ case class LogicalPlan( case _ => // Skip non-ScanSourceOpDesc operators } } - - def propagateWorkflowSchema( - context: WorkflowContext, - errorList: Option[ArrayBuffer[(OperatorIdentity, Throwable)]] - ): Unit = { - - operators.foreach(operator => { - if (operator.getContext == null) { - 
operator.setContext(context) - } - }) - - // propagate output schema following topological order - val topologicalOrderIterator = jgraphtDag.iterator() - topologicalOrderIterator.forEachRemaining(opId => { - val op = getOperator(opId) - val inputSchemas: Array[Option[Schema]] = if (op.isInstanceOf[SourceOperatorDescriptor]) { - Array() - } else { - op.operatorInfo.inputPorts - .flatMap(inputPort => { - links - .filter(link => link.toOpId == op.operatorIdentifier && link.toPortId == inputPort.id) - .map(link => { - val outputSchemaOpt = - getOperator(link.fromOpId).outputPortToSchemaMapping.get(link.fromPortId) - if (outputSchemaOpt.isDefined) { - op.inputPortToSchemaMapping(inputPort.id) = outputSchemaOpt.get - } - outputSchemaOpt - }) - }) - .toArray - } - - if (!inputSchemas.contains(None)) { - Try(op.getOutputSchemas(inputSchemas.map(_.get))) match { - case Success(outputSchemas) => - op.operatorInfo.outputPorts.foreach(outputPort => - op.outputPortToSchemaMapping(outputPort.id) = outputSchemas(outputPort.id.id) - ) - assert(outputSchemas.length == op.operatorInfo.outputPorts.length) - case Failure(err) => - logger.error("got error", err) - errorList match { - case Some(list) => list.append((opId, err)) - case None => // Throw the error if no errorList is provided - throw err - } - } - - } - }) - } } diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPort.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPort.scala deleted file mode 100644 index c33f29aa729..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPort.scala +++ /dev/null @@ -1,25 +0,0 @@ -package edu.uci.ics.texera.workflow - -import edu.uci.ics.amber.virtualidentity.OperatorIdentity - -case object LogicalPort { - def apply(operatorIdentity: OperatorIdentity, portOrdinal: Integer): LogicalPort = { - LogicalPort(operatorIdentity.id, portOrdinal) - } - - def apply( - operatorIdentity: OperatorIdentity, - portOrdinal: Integer, - 
portName: String - ): LogicalPort = { - LogicalPort(operatorIdentity.id, portOrdinal, portName) - } -} - -case class LogicalPort( - operatorID: String, - portOrdinal: Integer = 0, - portName: String = "" -) { - def operatorId: OperatorIdentity = OperatorIdentity(operatorID) -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala deleted file mode 100644 index 8b137891791..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/SinkInjectionTransformer.scala +++ /dev/null @@ -1 +0,0 @@ - diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala index da13af7e53a..615cc937dbf 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala @@ -37,7 +37,6 @@ class WorkflowCompiler( logicalPlan.getTopologicalOpIds.asScala.foreach(logicalOpId => Try { val logicalOp = logicalPlan.getOperator(logicalOpId) - logicalOp.setContext(context) val subPlan = logicalOp.getPhysicalPlan(context.workflowId, context.executionId) subPlan @@ -165,10 +164,7 @@ class WorkflowCompiler( // 2. resolve the file name in each scan source operator logicalPlan.resolveScanSourceOpFileName(None) - // 3. Propagate the schema to get the input & output schemas for each port of each operator - logicalPlan.propagateWorkflowSchema(context, None) - - // 4. expand the logical plan to the physical plan, and assign storage + // 3. 
expand the logical plan to the physical plan, and assign storage val physicalPlan = expandLogicalPlan(logicalPlan, logicalPlanPojo.opsToViewResult, None) Workflow(context, logicalPlan, physicalPlan) diff --git a/core/amber/src/test/scala/edu/uci/ics/texera/workflow/SchemaPropagationSpec.scala b/core/amber/src/test/scala/edu/uci/ics/texera/workflow/SchemaPropagationSpec.scala deleted file mode 100644 index cb77ae266aa..00000000000 --- a/core/amber/src/test/scala/edu/uci/ics/texera/workflow/SchemaPropagationSpec.scala +++ /dev/null @@ -1,199 +0,0 @@ -package edu.uci.ics.texera.workflow - -import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} -import edu.uci.ics.amber.core.workflow.{PhysicalOp, WorkflowContext} -import edu.uci.ics.amber.operator.LogicalOp -import edu.uci.ics.amber.operator.metadata.OperatorInfo -import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, OperatorIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} -import org.apache.arrow.util.Preconditions -import org.scalatest.BeforeAndAfter -import org.scalatest.flatspec.AnyFlatSpec - -class SchemaPropagationSpec extends AnyFlatSpec with BeforeAndAfter { - - private abstract class TempTestSourceOpDesc extends SourceOperatorDescriptor { - override def getPhysicalOp( - workflowId: WorkflowIdentity, - executionId: ExecutionIdentity - ): PhysicalOp = ??? - - override def operatorInfo: OperatorInfo = - OperatorInfo("", "", "", List(InputPort()), List(OutputPort())) - } - - private class TempTestSinkOpDesc extends LogicalOp { - override def getPhysicalOp( - workflowId: WorkflowIdentity, - executionId: ExecutionIdentity - ): PhysicalOp = ??? 
- - override def operatorInfo: OperatorInfo = - OperatorInfo("", "", "", List(InputPort()), List(OutputPort())) - - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Preconditions.checkArgument(schemas.length == 1) - schemas(0) - } - } - - it should "propagate workflow schema with multiple input and output ports" in { - // build the following workflow DAG: - // trainingData ---\ /----> mlVizSink - // testingData ----> mlTrainingOp--< - // inferenceData ---------------------> mlInferenceOp --> inferenceSink - - val dataSchema = Schema.builder().add("dataCol", AttributeType.INTEGER).build() - val trainingScan = new TempTestSourceOpDesc() { - override def operatorIdentifier: OperatorIdentity = OperatorIdentity("trainingScan") - - override def sourceSchema(): Schema = dataSchema - } - - val testingScan = new TempTestSourceOpDesc() { - override def operatorIdentifier: OperatorIdentity = OperatorIdentity("testingScan") - - override def sourceSchema(): Schema = dataSchema - } - - val inferenceScan = new TempTestSourceOpDesc() { - override def operatorIdentifier: OperatorIdentity = OperatorIdentity("inferenceScan") - - override def sourceSchema(): Schema = dataSchema - } - - val mlModelSchema = Schema.builder().add("model", AttributeType.STRING).build() - val mlVizSchema = Schema.builder().add("visualization", AttributeType.STRING).build() - - val mlTrainingOp = new LogicalOp() { - override def operatorIdentifier: OperatorIdentity = OperatorIdentity("mlTrainingOp") - - override def getPhysicalOp( - workflowId: WorkflowIdentity, - executionId: ExecutionIdentity - ): PhysicalOp = ??? - - override def operatorInfo: OperatorInfo = - OperatorInfo( - "", - "", - "", - List( - InputPort(displayName = "training"), - InputPort(PortIdentity(0), displayName = "testing") - ), - List( - OutputPort(displayName = "visualization"), - OutputPort(PortIdentity(1), displayName = "model") - ) - ) - - override def getOutputSchema(schemas: Array[Schema]): Schema = ??? 
- - override def getOutputSchemas(schemas: Array[Schema]): Array[Schema] = { - Preconditions.checkArgument(schemas.length == 2) - Preconditions.checkArgument(schemas.distinct.length == 1) - Array(mlVizSchema, mlModelSchema) - } - } - - val mlInferOp = new LogicalOp() { - override def operatorIdentifier: OperatorIdentity = OperatorIdentity("mlInferOp") - - override def getPhysicalOp( - workflowId: WorkflowIdentity, - executionId: ExecutionIdentity - ): PhysicalOp = ??? - - override def operatorInfo: OperatorInfo = - OperatorInfo( - "", - "", - "", - List(InputPort(displayName = "model"), InputPort(PortIdentity(1), displayName = "data")), - List(OutputPort(displayName = "data")) - ) - - override def getOutputSchema(schemas: Array[Schema]): Schema = ??? - - override def getOutputSchemas(schemas: Array[Schema]): Array[Schema] = { - Preconditions.checkArgument(schemas.length == 2) - Array(schemas(1)) - } - } - - val mlVizSink = new TempTestSinkOpDesc { - override def operatorIdentifier: OperatorIdentity = OperatorIdentity("mlVizSink") - } - - val inferenceSink = new TempTestSinkOpDesc { - override def operatorIdentifier: OperatorIdentity = OperatorIdentity("inferenceSink") - } - - val operators = List( - trainingScan, - testingScan, - inferenceScan, - mlTrainingOp, - mlInferOp, - mlVizSink, - inferenceSink - ) - - val links = List( - LogicalLink( - trainingScan.operatorIdentifier, - PortIdentity(), - mlTrainingOp.operatorIdentifier, - PortIdentity() - ), - LogicalLink( - testingScan.operatorIdentifier, - PortIdentity(), - mlTrainingOp.operatorIdentifier, - PortIdentity(1) - ), - LogicalLink( - inferenceScan.operatorIdentifier, - PortIdentity(), - mlInferOp.operatorIdentifier, - PortIdentity(1) - ), - LogicalLink( - mlTrainingOp.operatorIdentifier, - PortIdentity(), - mlVizSink.operatorIdentifier, - PortIdentity(0) - ), - LogicalLink( - mlTrainingOp.operatorIdentifier, - PortIdentity(1), - mlInferOp.operatorIdentifier, - PortIdentity() - ), - LogicalLink( - 
mlInferOp.operatorIdentifier, - PortIdentity(), - inferenceSink.operatorIdentifier, - PortIdentity() - ) - ) - - val ctx = new WorkflowContext() - val logicalPlan = LogicalPlan(operators, links) - logicalPlan.propagateWorkflowSchema(ctx, None) - val schemaResult = logicalPlan.getInputSchemaMap - - assert(schemaResult(mlTrainingOp.operatorIdentifier).head.get.equals(dataSchema)) - assert(schemaResult(mlTrainingOp.operatorIdentifier)(1).get.equals(dataSchema)) - - assert(schemaResult(mlInferOp.operatorIdentifier).head.get.equals(mlModelSchema)) - assert(schemaResult(mlInferOp.operatorIdentifier)(1).get.equals(dataSchema)) - - assert(schemaResult(mlVizSink.operatorIdentifier).head.get.equals(mlVizSchema)) - assert(schemaResult(inferenceSink.operatorIdentifier).head.get.equals(dataSchema)) - - } - -} diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala index 8a23c681ef7..4a50f33f806 100644 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala @@ -110,7 +110,6 @@ class WorkflowCompiler( logicalPlan.getTopologicalOpIds.asScala.foreach(logicalOpId => Try { val logicalOp = logicalPlan.getOperator(logicalOpId) - logicalOp.setContext(context) val subPlan = logicalOp.getPhysicalPlan(context.workflowId, context.executionId) subPlan @@ -166,7 +165,7 @@ class WorkflowCompiler( // 1. convert the pojo to logical plan val logicalPlan: LogicalPlan = LogicalPlan(logicalPlanPojo) - // - resolve the file name in each scan source operator + // 2. resolve the file name in each scan source operator logicalPlan.resolveScanSourceOpFileName(Some(errorList)) // 3. 
expand the logical plan to the physical plan diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala index c1a23c00ace..8b599176cd7 100644 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala @@ -67,38 +67,11 @@ case class LogicalPlan( def getOperator(opId: OperatorIdentity): LogicalOp = operatorMap(opId) - def getSourceOperatorIds: List[OperatorIdentity] = - operatorMap.keys.filter(op => jgraphtDag.inDegreeOf(op) == 0).toList - - def getTerminalOperatorIds: List[OperatorIdentity] = - operatorMap.keys - .filter(op => jgraphtDag.outDegreeOf(op) == 0) - .toList - - def getAncestorOpIds(opId: OperatorIdentity): Set[OperatorIdentity] = { - jgraphtDag.getAncestors(opId).asScala.toSet - } - - def getUpstreamOps(opId: OperatorIdentity): List[LogicalOp] = { - jgraphtDag - .incomingEdgesOf(opId) - .asScala - .map(e => operatorMap(e.fromOpId)) - .toList - } - def addOperator(op: LogicalOp): LogicalPlan = { // TODO: fix schema for the new operator this.copy(operators :+ op, links) } - def removeOperator(opId: OperatorIdentity): LogicalPlan = { - this.copy( - operators.filter(o => o.operatorIdentifier != opId), - links.filter(l => l.fromOpId != opId && l.toOpId != opId) - ) - } - def addLink( fromOpId: OperatorIdentity, fromPortId: PortIdentity, @@ -115,36 +88,10 @@ case class LogicalPlan( this.copy(operators, newLinks) } - def removeLink(linkToRemove: LogicalLink): LogicalPlan = { - this.copy(operators, links.filter(l => l != linkToRemove)) - } - - def getDownstreamOps(opId: OperatorIdentity): List[LogicalOp] = { - val downstream = new mutable.ArrayBuffer[LogicalOp] - jgraphtDag - .outgoingEdgesOf(opId) - .forEach(e => downstream += operatorMap(e.toOpId)) - 
downstream.toList - } - - def getDownstreamLinks(opId: OperatorIdentity): List[LogicalLink] = { - links.filter(l => l.fromOpId == opId) - } - def getUpstreamLinks(opId: OperatorIdentity): List[LogicalLink] = { links.filter(l => l.toOpId == opId) } - def getInputSchemaMap: Map[OperatorIdentity, List[Option[Schema]]] = { - operators - .map(operator => { - operator.operatorIdentifier -> operator.operatorInfo.inputPorts.map(inputPort => - operator.inputPortToSchemaMapping.get(inputPort.id) - ) - }) - .toMap - } - /** * Resolve all user-given filename for the scan source operators to URIs, and call op.setFileUri to set the URi * @param errorList if given, put errors during resolving to it @@ -170,57 +117,4 @@ case class LogicalPlan( case _ => // Skip non-ScanSourceOpDesc operators } } - - def propagateWorkflowSchema( - context: WorkflowContext, - errorList: Option[ArrayBuffer[(OperatorIdentity, Throwable)]] - ): Unit = { - - operators.foreach(operator => { - if (operator.getContext == null) { - operator.setContext(context) - } - }) - - // propagate output schema following topological order - val topologicalOrderIterator = jgraphtDag.iterator() - topologicalOrderIterator.forEachRemaining(opId => { - val op = getOperator(opId) - val inputSchemas: Array[Option[Schema]] = if (op.isInstanceOf[SourceOperatorDescriptor]) { - Array() - } else { - op.operatorInfo.inputPorts - .flatMap(inputPort => { - links - .filter(link => link.toOpId == op.operatorIdentifier && link.toPortId == inputPort.id) - .map(link => { - val outputSchemaOpt = - getOperator(link.fromOpId).outputPortToSchemaMapping.get(link.fromPortId) - if (outputSchemaOpt.isDefined) { - op.inputPortToSchemaMapping(inputPort.id) = outputSchemaOpt.get - } - outputSchemaOpt - }) - }) - .toArray - } - - if (!inputSchemas.contains(None)) { - Try(op.getOutputSchemas(inputSchemas.map(_.get))) match { - case Success(outputSchemas) => - op.operatorInfo.outputPorts.foreach(outputPort => - 
op.outputPortToSchemaMapping(outputPort.id) = outputSchemas(outputPort.id.id) - ) - assert(outputSchemas.length == op.operatorInfo.outputPorts.length) - case Failure(err) => - logger.error("got error", err) - errorList match { - case Some(list) => list.append((opId, err)) - case None => - } - } - - } - }) - } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala index 925fc5314a0..e374fac80c7 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala @@ -5,7 +5,7 @@ import com.fasterxml.jackson.annotation._ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.{PhysicalOp, PhysicalPlan, WorkflowContext} +import edu.uci.ics.amber.core.workflow.{PhysicalOp, PhysicalPlan} import edu.uci.ics.amber.operator.aggregate.AggregateOpDesc import edu.uci.ics.amber.operator.cartesianProduct.CartesianProductOpDesc import edu.uci.ics.amber.operator.dictionary.DictionaryMatcherOpDesc @@ -37,35 +37,7 @@ import edu.uci.ics.amber.operator.randomksampling.RandomKSamplingOpDesc import edu.uci.ics.amber.operator.regex.RegexOpDesc import edu.uci.ics.amber.operator.reservoirsampling.ReservoirSamplingOpDesc import edu.uci.ics.amber.operator.sentiment.SentimentAnalysisOpDesc -import edu.uci.ics.amber.operator.sklearn.{ - SklearnAdaptiveBoostingOpDesc, - SklearnBaggingOpDesc, - SklearnBernoulliNaiveBayesOpDesc, - SklearnComplementNaiveBayesOpDesc, - SklearnDecisionTreeOpDesc, - SklearnDummyClassifierOpDesc, - SklearnExtraTreeOpDesc, - SklearnExtraTreesOpDesc, - SklearnGaussianNaiveBayesOpDesc, - SklearnGradientBoostingOpDesc, - SklearnKNNOpDesc, - SklearnLinearRegressionOpDesc, - 
SklearnLinearSVMOpDesc, - SklearnLogisticRegressionCVOpDesc, - SklearnLogisticRegressionOpDesc, - SklearnMultiLayerPerceptronOpDesc, - SklearnMultinomialNaiveBayesOpDesc, - SklearnNearestCentroidOpDesc, - SklearnPassiveAggressiveOpDesc, - SklearnPerceptronOpDesc, - SklearnPredictionOpDesc, - SklearnProbabilityCalibrationOpDesc, - SklearnRandomForestOpDesc, - SklearnRidgeCVOpDesc, - SklearnRidgeOpDesc, - SklearnSDGOpDesc, - SklearnSVMOpDesc -} +import edu.uci.ics.amber.operator.sklearn._ import edu.uci.ics.amber.operator.sort.SortOpDesc import edu.uci.ics.amber.operator.sortPartitions.SortPartitionsOpDesc import edu.uci.ics.amber.operator.source.apis.reddit.RedditSearchSourceOpDesc @@ -75,6 +47,7 @@ import edu.uci.ics.amber.operator.source.apis.twitter.v2.{ } import edu.uci.ics.amber.operator.source.fetcher.URLFetcherOpDesc import edu.uci.ics.amber.operator.source.scan.FileScanSourceOpDesc +import edu.uci.ics.amber.operator.source.scan.arrow.ArrowSourceOpDesc import edu.uci.ics.amber.operator.source.scan.csv.CSVScanSourceOpDesc import edu.uci.ics.amber.operator.source.scan.csvOld.CSVOldScanSourceOpDesc import edu.uci.ics.amber.operator.source.scan.json.JSONLScanSourceOpDesc @@ -121,7 +94,6 @@ import edu.uci.ics.amber.operator.visualization.ternaryPlot.TernaryPlotOpDesc import edu.uci.ics.amber.operator.visualization.urlviz.UrlVizOpDesc import edu.uci.ics.amber.operator.visualization.waterfallChart.WaterfallChartOpDesc import edu.uci.ics.amber.operator.visualization.wordCloud.WordCloudOpDesc -import edu.uci.ics.amber.operator.source.scan.arrow.ArrowSourceOpDesc import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, OperatorIdentity, WorkflowIdentity} import edu.uci.ics.amber.workflow.PortIdentity import org.apache.commons.lang3.builder.{EqualsBuilder, HashCodeBuilder, ToStringBuilder} @@ -306,14 +278,11 @@ trait StateTransferFunc ) abstract class LogicalOp extends PortDescriptor with Serializable { - @JsonIgnore - private var context: WorkflowContext = _ - 
@JsonProperty(PropertyNameConstants.OPERATOR_ID) private var operatorId: String = getClass.getSimpleName + "-" + UUID.randomUUID.toString @JsonProperty(PropertyNameConstants.OPERATOR_VERSION) - var operatorVersion: String = getOperatorVersion() + var operatorVersion: String = getOperatorVersion @JsonIgnore val inputPortToSchemaMapping: mutable.Map[PortIdentity, Schema] = mutable.HashMap() @@ -334,7 +303,7 @@ abstract class LogicalOp extends PortDescriptor with Serializable { workflowId: WorkflowIdentity, executionId: ExecutionIdentity ): PhysicalPlan = { - new PhysicalPlan( + PhysicalPlan( operators = Set(getPhysicalOp(workflowId, executionId)), links = Set.empty ) @@ -344,7 +313,7 @@ abstract class LogicalOp extends PortDescriptor with Serializable { def getOutputSchema(schemas: Array[Schema]): Schema - private def getOperatorVersion(): String = { + private def getOperatorVersion: String = { val path = "core/amber/src/main/scala/" val operatorPath = path + this.getClass.getPackage.getName.replace(".", "/") OPVersion.getVersion(this.getClass.getSimpleName, operatorPath) @@ -361,12 +330,6 @@ abstract class LogicalOp extends PortDescriptor with Serializable { override def toString: String = ToStringBuilder.reflectionToString(this) - def getContext: WorkflowContext = this.context - - def setContext(workflowContext: WorkflowContext): Unit = { - this.context = workflowContext - } - def setOperatorId(id: String): Unit = { operatorId = id } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala index 3bb144b70a8..c40d736d135 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala @@ -121,7 +121,7 @@ class HashJoinOpDesc[K] extends LogicalOp { ) ) - new PhysicalPlan( + 
PhysicalPlan( operators = Set(buildPhysicalOp, probePhysicalOp), links = Set( PhysicalLink( diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala index a77c67b9143..27a988d9f37 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala @@ -57,8 +57,8 @@ class SentimentAnalysisOpDesc extends MapOpDesc { ) } - override def operatorInfo = - new OperatorInfo( + override def operatorInfo: OperatorInfo = + OperatorInfo( "Sentiment Analysis", "analysis the sentiment of a text using machine learning", OperatorGroupConstants.MACHINE_LEARNING_GROUP, diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala index cb779ce96ee..80aee74cf63 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala @@ -4,7 +4,6 @@ import com.fasterxml.jackson.annotation.{JsonIgnore, JsonProperty, JsonPropertyD import com.fasterxml.jackson.databind.annotation.JsonDeserialize import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.WorkflowContext import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor import edu.uci.ics.amber.workflow.OutputPort @@ -54,10 +53,6 @@ abstract class ScanSourceOpDesc extends SourceOperatorDescriptor { inferSchema() } - override def 
setContext(workflowContext: WorkflowContext): Unit = { - super.setContext(workflowContext) - } - override def operatorInfo: OperatorInfo = { OperatorInfo( userFriendlyName = s"${fileTypeName.getOrElse("Unknown")} File Scan", diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala index 5d4be121d08..06dd412b8b9 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala @@ -2,7 +2,6 @@ package edu.uci.ics.amber.operator.source.scan.csv import edu.uci.ics.amber.core.storage.FileResolver import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} -import edu.uci.ics.amber.core.workflow.WorkflowContext import edu.uci.ics.amber.core.workflow.WorkflowContext.{DEFAULT_EXECUTION_ID, DEFAULT_WORKFLOW_ID} import edu.uci.ics.amber.operator.TestOperators import edu.uci.ics.amber.workflow.PortIdentity @@ -11,7 +10,6 @@ import org.scalatest.flatspec.AnyFlatSpec class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { - val workflowContext = new WorkflowContext() var csvScanSourceOpDesc: CSVScanSourceOpDesc = _ var parallelCsvScanSourceOpDesc: ParallelCSVScanSourceOpDesc = _ before { @@ -28,7 +26,6 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { parallelCsvScanSourceOpDesc.fileName = Some(TestOperators.CountrySalesSmallCsvPath) parallelCsvScanSourceOpDesc.customDelimiter = Some(",") parallelCsvScanSourceOpDesc.hasHeader = true - parallelCsvScanSourceOpDesc.setContext(workflowContext) parallelCsvScanSourceOpDesc.setFileUri( FileResolver.resolve(parallelCsvScanSourceOpDesc.fileName.get) ) @@ -45,7 +42,6 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { 
parallelCsvScanSourceOpDesc.fileName = Some(TestOperators.CountrySalesHeaderlessSmallCsvPath) parallelCsvScanSourceOpDesc.customDelimiter = Some(",") parallelCsvScanSourceOpDesc.hasHeader = false - parallelCsvScanSourceOpDesc.setContext(workflowContext) parallelCsvScanSourceOpDesc.setFileUri( FileResolver.resolve(parallelCsvScanSourceOpDesc.fileName.get) ) @@ -62,7 +58,6 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { csvScanSourceOpDesc.fileName = Some(TestOperators.CountrySalesSmallMultiLineCsvPath) csvScanSourceOpDesc.customDelimiter = Some(",") csvScanSourceOpDesc.hasHeader = true - csvScanSourceOpDesc.setContext(workflowContext) csvScanSourceOpDesc.setFileUri(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) val inferredSchema: Schema = csvScanSourceOpDesc.inferSchema() @@ -77,7 +72,6 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { csvScanSourceOpDesc.fileName = Some(TestOperators.CountrySalesHeaderlessSmallCsvPath) csvScanSourceOpDesc.customDelimiter = Some(",") csvScanSourceOpDesc.hasHeader = false - csvScanSourceOpDesc.setContext(workflowContext) csvScanSourceOpDesc.setFileUri(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) val inferredSchema: Schema = csvScanSourceOpDesc.inferSchema() @@ -93,7 +87,6 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { Some(TestOperators.CountrySalesSmallMultiLineCustomDelimiterCsvPath) csvScanSourceOpDesc.customDelimiter = Some(";") csvScanSourceOpDesc.hasHeader = false - csvScanSourceOpDesc.setContext(workflowContext) csvScanSourceOpDesc.setFileUri(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) val inferredSchema: Schema = csvScanSourceOpDesc.inferSchema() @@ -109,7 +102,6 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { Some(TestOperators.CountrySalesSmallMultiLineCustomDelimiterCsvPath) csvScanSourceOpDesc.customDelimiter = Some(";") csvScanSourceOpDesc.hasHeader = false - 
csvScanSourceOpDesc.setContext(workflowContext) csvScanSourceOpDesc.setFileUri(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) assert( From 14572dd9730446c8eb6f3dc18f58dcac035d7dea Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Sun, 29 Dec 2024 20:17:23 -0800 Subject: [PATCH 15/47] Remove cache checker in logical plan (#3178) The cache logic will be rewritten at the physical plan layer, using ports as the caching unit. This version is being removed temporarily. --- .../web/service/WorkflowCacheChecker.scala | 105 ------------------ .../uci/ics/texera/workflow/LogicalPlan.scala | 9 -- 2 files changed, 114 deletions(-) delete mode 100644 core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowCacheChecker.scala diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowCacheChecker.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowCacheChecker.scala deleted file mode 100644 index 0c4b9f54c35..00000000000 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowCacheChecker.scala +++ /dev/null @@ -1,105 +0,0 @@ -package edu.uci.ics.texera.web.service - -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.texera.web.model.websocket.request.EditingTimeCompilationRequest -import edu.uci.ics.texera.workflow.LogicalPlan - -import scala.collection.mutable - -object WorkflowCacheChecker { - - def handleCacheStatusUpdate( - oldPlan: Option[LogicalPlan], - newPlan: LogicalPlan, - request: EditingTimeCompilationRequest - ): Map[String, String] = { - val validCacheOps = new WorkflowCacheChecker(oldPlan, newPlan).getValidCacheReuse - val cacheUpdateResult = request.opsToReuseResult - .map(idString => OperatorIdentity(idString)) - .map(opId => (opId.id, if (validCacheOps.contains(opId)) "cache valid" else "cache invalid")) - .toMap - cacheUpdateResult - } - -} - -class WorkflowCacheChecker(oldWorkflowOpt: Option[LogicalPlan], 
newWorkflow: LogicalPlan) { - - private val equivalenceClass = new mutable.HashMap[String, Int]() - private var nextClassId: Int = 0 - - private def getNextClassId: Int = { - nextClassId += 1 - nextClassId - } - - // checks the validity of the cache given the old plan and the new plan - // returns a set of operator IDs that can be reused - // the operatorId is also the storage key - def getValidCacheReuse: Set[OperatorIdentity] = { - if (oldWorkflowOpt.isEmpty) { - return Set() - } - - val oldWorkflow = oldWorkflowOpt.get - // for each operator in the old workflow, add it to its own equivalence class - oldWorkflow.getTopologicalOpIds - .forEachRemaining(opId => { - val oldId = "old-" + opId - equivalenceClass.put(oldId, nextClassId) - nextClassId += 1 - }) - - // for each operator in the new workflow - // check if - // 1: an operator with the same content can be found in the old workflow, and - // 2: the input operators are also in the same equivalence class - // - // if both conditions are met, then the two operators are equal, - // else a new equivalence class is created - newWorkflow.getTopologicalOpIds - .forEachRemaining(opId => { - val newOp = newWorkflow.getOperator(opId) - val newOpUpstreamClasses = newWorkflow - .getUpstreamOps(opId) - .map(op => equivalenceClass("new-" + op.operatorIdentifier)) - val oldOp = oldWorkflow.operators.find(op => op.equals(newOp)).orNull - - // check if the old workflow contains the same operator content - val newOpClassId = if (oldOp == null) { - getNextClassId // operator not found, create a new class - } else { - // check its inputs are all in the same equivalence class - val oldId = "old-" + oldOp.operatorIdentifier - val oldOpUpstreamClasses = oldWorkflow - .getUpstreamOps(oldOp.operatorIdentifier) - .map(op => equivalenceClass("old-" + op.operatorIdentifier)) - if (oldOpUpstreamClasses.equals(newOpUpstreamClasses)) { - equivalenceClass(oldId) // same equivalence class - } else { - getNextClassId // inputs are no the same, 
new class - } - } - equivalenceClass.put("new-" + opId, newOpClassId) - }) - - // for each cached operator in the old workflow, - // check if it can be still used in the new workflow - oldWorkflow.getTerminalOperatorIds - .map(sinkOpId => { - val opId = oldWorkflow.getUpstreamOps(sinkOpId).head.operatorIdentifier - val oldCachedOpId = "old-" + opId - // find its equivalence class - val oldClassId = equivalenceClass(oldCachedOpId) - // find the corresponding operator that can still use this cache - val newOpId = equivalenceClass - .find(p => p._2 == oldClassId && p._1 != oldCachedOpId) - .map(p => p._1) - .orNull - if (newOpId == null) null else opId - }) - .filter(o => o != null) - .toSet - } - -} diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala index 46bbb441cbe..f9475fd6bb9 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala @@ -11,7 +11,6 @@ import org.jgrapht.util.SupplierUtil import java.util import scala.collection.mutable.ArrayBuffer -import scala.jdk.CollectionConverters.SetHasAsScala import scala.util.{Failure, Success, Try} object LogicalPlan { @@ -65,14 +64,6 @@ case class LogicalPlan( .filter(op => jgraphtDag.outDegreeOf(op) == 0) .toList - def getUpstreamOps(opId: OperatorIdentity): List[LogicalOp] = { - jgraphtDag - .incomingEdgesOf(opId) - .asScala - .map(e => operatorMap(e.fromOpId)) - .toList - } - def getUpstreamLinks(opId: OperatorIdentity): List[LogicalLink] = { links.filter(l => l.toOpId == opId) } From 5b622e320725d75d5e18c29711339bb7bf2ce0e3 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Sun, 29 Dec 2024 22:23:37 -0800 Subject: [PATCH 16/47] Support multiple output ports with storage (#3175) Previously, all results were tied to logical operators. 
This PR modifies the engine to associate results with output ports, enabling better granularity and support for operators with multiple outputs. ### Key Change: StorageKey with PortIdentity The most significant update in this PR is the adjustment of the storage key format to include both the logical operator ID and the port ID. This ensures that logical operators with multiple output ports (e.g., Split) can have distinct storages created for each output port. For now, the frontend retrieves results from the default output port (port 0). In future updates, the frontend will be enhanced to support retrieving results from additional output ports, providing more flexibility in how results are accessed and displayed. --- .../messaginglayer/OutputManager.scala | 2 +- .../scheduling/ScheduleGenerator.scala | 92 +++-------- .../web/service/ExecutionResultService.scala | 45 +++-- .../web/service/ResultExportService.scala | 8 +- .../texera/workflow/WorkflowCompiler.scala | 104 ++++++------ .../amber/engine/e2e/DataProcessingSpec.scala | 12 +- .../core/storage/result/MongoDocument.scala | 23 +-- .../core/storage/result/OpResultStorage.scala | 154 ++++++++++++------ .../operator/SpecialPhysicalOpFactory.scala | 34 +++- .../sink/managed/ProgressiveSinkOpExec.scala | 2 +- .../source/cache/CacheSourceOpExec.scala | 8 +- 11 files changed, 257 insertions(+), 227 deletions(-) diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManager.scala index 18f7c4974b4..d83eb0c1a57 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManager.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManager.scala @@ -192,7 +192,7 @@ class OutputManager( } def getSingleOutputPortIdentity: PortIdentity = { - assert(ports.size == 1) + assert(ports.size == 1, "expect 1 
output port, got " + ports.size) ports.head._1 } diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala index 121be2289b1..5728f197dbc 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala @@ -1,22 +1,15 @@ package edu.uci.ics.amber.engine.architecture.scheduling -import edu.uci.ics.amber.core.executor.OpExecInitInfo import edu.uci.ics.amber.core.storage.result.{OpResultStorage, ResultStorage} -import edu.uci.ics.amber.core.workflow.{ - PhysicalOp, - PhysicalPlan, - SchemaPropagationFunc, - WorkflowContext -} +import edu.uci.ics.amber.core.workflow.{PhysicalOp, PhysicalPlan, WorkflowContext} import edu.uci.ics.amber.engine.architecture.scheduling.ScheduleGenerator.replaceVertex import edu.uci.ics.amber.engine.architecture.scheduling.resourcePolicies.{ DefaultResourceAllocator, ExecutionClusterInfo } import edu.uci.ics.amber.operator.SpecialPhysicalOpFactory -import edu.uci.ics.amber.operator.source.cache.CacheSourceOpExec -import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, PhysicalOpIdentity} -import edu.uci.ics.amber.workflow.{OutputPort, PhysicalLink} +import edu.uci.ics.amber.virtualidentity.PhysicalOpIdentity +import edu.uci.ics.amber.workflow.PhysicalLink import org.jgrapht.graph.DirectedAcyclicGraph import org.jgrapht.traverse.TopologicalOrderIterator @@ -119,9 +112,9 @@ abstract class ScheduleGenerator( physicalPlan .getLinksBetween(upstreamPhysicalOpId, physicalOpId) .filter(link => - !physicalPlan.getOperator(physicalOpId).isSinkOperator && (physicalPlan + !physicalPlan.getOperator(physicalOpId).isSinkOperator && physicalPlan .getOperator(physicalOpId) - .isInputLinkDependee(link)) + .isInputLinkDependee(link) ) } } @@ -158,7 
+151,19 @@ abstract class ScheduleGenerator( .removeLink(physicalLink) // create cache writer and link - val matWriterPhysicalOp: PhysicalOp = createMatWriter(physicalLink) + val storageKey = OpResultStorage.createStorageKey( + physicalLink.fromOpId.logicalOpId, + physicalLink.fromPortId, + isMaterialized = true + ) + val fromPortOutputMode = + physicalPlan.getOperator(physicalLink.fromOpId).outputPorts(physicalLink.fromPortId)._1.mode + val matWriterPhysicalOp: PhysicalOp = SpecialPhysicalOpFactory.newSinkPhysicalOp( + workflowContext.workflowId, + workflowContext.executionId, + storageKey, + fromPortOutputMode + ) val sourceToWriterLink = PhysicalLink( fromOp.id, @@ -170,7 +175,7 @@ abstract class ScheduleGenerator( .addOperator(matWriterPhysicalOp) .addLink(sourceToWriterLink) - // expect exactly one input port and one output port + // sink has exactly one input port and one output port val schema = newPhysicalPlan .getOperator(matWriterPhysicalOp.id) .outputPorts(matWriterPhysicalOp.outputPorts.keys.head) @@ -180,14 +185,17 @@ abstract class ScheduleGenerator( ResultStorage .getOpResultStorage(workflowContext.workflowId) .create( - key = matWriterPhysicalOp.id.logicalOpId, + key = storageKey, mode = OpResultStorage.defaultStorageMode, - schema = Some(schema) + schema = schema ) // create cache reader and link - val matReaderPhysicalOp: PhysicalOp = - createMatReader(matWriterPhysicalOp.id.logicalOpId, physicalLink) + val matReaderPhysicalOp: PhysicalOp = SpecialPhysicalOpFactory.newSourcePhysicalOp( + workflowContext.workflowId, + workflowContext.executionId, + storageKey + ) val readerToDestLink = PhysicalLink( matReaderPhysicalOp.id, @@ -201,52 +209,4 @@ abstract class ScheduleGenerator( .addOperator(matReaderPhysicalOp) .addLink(readerToDestLink) } - - private def createMatReader( - matWriterLogicalOpId: OperatorIdentity, - physicalLink: PhysicalLink - ): PhysicalOp = { - val opResultStorage = ResultStorage.getOpResultStorage(workflowContext.workflowId) - 
PhysicalOp - .sourcePhysicalOp( - workflowContext.workflowId, - workflowContext.executionId, - OperatorIdentity(s"cacheSource_${getMatIdFromPhysicalLink(physicalLink)}"), - OpExecInitInfo((_, _) => - new CacheSourceOpExec( - opResultStorage.get(matWriterLogicalOpId) - ) - ) - ) - .withInputPorts(List.empty) - .withOutputPorts(List(OutputPort())) - .withPropagateSchema( - SchemaPropagationFunc(_ => - Map( - OutputPort().id -> opResultStorage.getSchema(matWriterLogicalOpId).get - ) - ) - ) - .propagateSchema() - - } - - private def createMatWriter(physicalLink: PhysicalLink): PhysicalOp = { - val outputMode = - physicalPlan.getOperator(physicalLink.fromOpId).outputPorts(physicalLink.fromPortId)._1.mode - val storageKey = s"materialized_${getMatIdFromPhysicalLink(physicalLink)}" - SpecialPhysicalOpFactory.newSinkPhysicalOp( - workflowContext.workflowId, - workflowContext.executionId, - storageKey, - outputMode - ) - } - - private def getMatIdFromPhysicalLink(physicalLink: PhysicalLink) = - s"${physicalLink.fromOpId.logicalOpId}_${physicalLink.fromOpId.layerName}_" + - s"${physicalLink.fromPortId.id}_" + - s"${physicalLink.toOpId.logicalOpId}_${physicalLink.toOpId.layerName}_" + - s"${physicalLink.toPortId.id}" - } diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala index a0b09ff3fd1..70a21bba475 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala @@ -5,12 +5,8 @@ import com.fasterxml.jackson.annotation.{JsonTypeInfo, JsonTypeName} import com.fasterxml.jackson.databind.node.ObjectNode import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.storage.StorageConfig -import edu.uci.ics.amber.core.storage.result.{ - MongoDocument, - OperatorResultMetadata, - ResultStorage, - 
WorkflowResultStore -} +import edu.uci.ics.amber.core.storage.result.OpResultStorage.MONGODB +import edu.uci.ics.amber.core.storage.result._ import edu.uci.ics.amber.core.tuple.Tuple import edu.uci.ics.amber.core.workflow.{PhysicalOp, PhysicalPlan} import edu.uci.ics.amber.engine.architecture.controller.{ExecutionStateUpdate, FatalError} @@ -25,6 +21,7 @@ import edu.uci.ics.amber.engine.common.executionruntimestate.ExecutionMetadataSt import edu.uci.ics.amber.engine.common.{AmberConfig, AmberRuntime} import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, WorkflowIdentity} import edu.uci.ics.amber.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.workflow.PortIdentity import edu.uci.ics.texera.web.SubscriptionManager import edu.uci.ics.texera.web.model.websocket.event.{ PaginatedResultEvent, @@ -89,7 +86,9 @@ object ExecutionResultService { } val storage = - ResultStorage.getOpResultStorage(workflowIdentity).get(physicalOps.head.id.logicalOpId) + ResultStorage + .getOpResultStorage(workflowIdentity) + .get(OpResultStorage.createStorageKey(physicalOps.head.id.logicalOpId, PortIdentity())) val webUpdate = webOutputMode match { case PaginationMode() => val numTuples = storage.getCount @@ -238,10 +237,12 @@ class ExecutionResultService( oldInfo.tupleCount, info.tupleCount ) - if (StorageConfig.resultStorageMode.toLowerCase == "mongodb") { + if (StorageConfig.resultStorageMode == MONGODB) { + // using the first port for now. 
TODO: support multiple ports + val storageKey = OpResultStorage.createStorageKey(opId, PortIdentity()) val opStorage = ResultStorage .getOpResultStorage(workflowIdentity) - .get(physicalPlan.getPhysicalOpsOfLogicalOp(opId).head.id.logicalOpId) + .get(storageKey) opStorage match { case mongoDocument: MongoDocument[Tuple] => val tableCatStats = mongoDocument.getCategoricalStats @@ -277,15 +278,16 @@ class ExecutionResultService( def handleResultPagination(request: ResultPaginationRequest): TexeraWebSocketEvent = { // calculate from index (pageIndex starts from 1 instead of 0) val from = request.pageSize * (request.pageIndex - 1) - val opId = OperatorIdentity(request.operatorID) - val paginationIterable = { + // using the first port for now. TODO: support multiple ports + val storageKey = + OpResultStorage.createStorageKey(OperatorIdentity(request.operatorID), PortIdentity()) + val paginationIterable = { ResultStorage .getOpResultStorage(workflowIdentity) - .get(opId) + .get(storageKey) .getRange(from, from + request.pageSize) .to(Iterable) - } val mappedResults = paginationIterable .map(tuple => tuple.asKeyValuePairJson()) @@ -302,7 +304,7 @@ class ExecutionResultService( ResultStorage .getOpResultStorage(workflowIdentity) .getAllKeys - .filter(!_.id.startsWith("materialized_")) + .filter(!_.startsWith("materialized_")) .map(storageKey => { val count = ResultStorage .getOpResultStorage(workflowIdentity) @@ -310,20 +312,15 @@ class ExecutionResultService( .getCount .toInt - val opId = storageKey + val (opId, storagePortId) = OpResultStorage.decodeStorageKey(storageKey) - // use the first output port's mode + // Retrieve the mode of the specified output port val mode = physicalPlan .getPhysicalOpsOfLogicalOp(opId) - .flatMap(physicalOp => physicalOp.outputPorts) - .filter({ - case (portId, (port, links, schema)) => - !portId.internal - }) - .map({ - case (portId, (port, links, schema)) => port.mode - }) + .flatMap(_.outputPorts.get(storagePortId)) + .map(_._1.mode) 
.head + val changeDetector = if (mode == OutputMode.SET_SNAPSHOT) { UUID.randomUUID.toString diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala index c2a0693bd85..83e3d89f347 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala @@ -24,6 +24,7 @@ import edu.uci.ics.texera.web.resource.dashboard.user.dataset.DatasetResource.{ import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowVersionResource import org.jooq.types.UInteger import edu.uci.ics.amber.util.ArrowUtils +import edu.uci.ics.amber.workflow.PortIdentity import java.io.{PipedInputStream, PipedOutputStream} import java.nio.charset.StandardCharsets @@ -71,8 +72,11 @@ class ResultExportService(workflowIdentity: WorkflowIdentity) { } // By now the workflow should finish running + // Only supports external port 0 for now. 
TODO: support multiple ports val operatorResult: VirtualDocument[Tuple] = - ResultStorage.getOpResultStorage(workflowIdentity).get(OperatorIdentity(request.operatorId)) + ResultStorage + .getOpResultStorage(workflowIdentity) + .get(OpResultStorage.createStorageKey(OperatorIdentity(request.operatorId), PortIdentity())) if (operatorResult == null) { return ResultExportResponse("error", "The workflow contains no results") } @@ -190,7 +194,7 @@ class ResultExportService(workflowIdentity: WorkflowIdentity) { val columnIndex = request.columnIndex val filename = request.filename - if (rowIndex >= results.size || columnIndex >= results.head.getFields.size) { + if (rowIndex >= results.size || columnIndex >= results.head.getFields.length) { return ResultExportResponse("error", s"Invalid row or column index") } diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala index 615cc937dbf..795939f1ea0 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala @@ -2,7 +2,6 @@ package edu.uci.ics.texera.workflow import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.storage.result.{OpResultStorage, ResultStorage} -import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{PhysicalPlan, WorkflowContext} import edu.uci.ics.amber.engine.architecture.controller.Workflow import edu.uci.ics.amber.engine.common.Utils.objectMapper @@ -70,62 +69,59 @@ class WorkflowCompiler( // assign the sinks to toAddSink operators' external output ports subPlan .topologicalIterator() + .filter(opId => toAddSink.contains(opId.logicalOpId)) .map(physicalPlan.getOperator) - .flatMap { physicalOp => - physicalOp.outputPorts.map(outputPort => (physicalOp, outputPort)) - } - .filter({ - case (physicalOp, (_, (outputPort, _, _))) => - 
toAddSink.contains(physicalOp.id.logicalOpId) && !outputPort.id.internal - }) - .foreach({ - case (physicalOp, (_, (outputPort, _, schema))) => - val storage = ResultStorage.getOpResultStorage(context.workflowId) - val storageKey = physicalOp.id.logicalOpId - - // due to the size limit of single document in mongoDB (16MB) - // for sinks visualizing HTMLs which could possibly be large in size, we always use the memory storage. - val storageType = { - if (outputPort.mode == SINGLE_SNAPSHOT) OpResultStorage.MEMORY - else OpResultStorage.defaultStorageMode - } - if (!storage.contains(storageKey)) { - // get the schema for result storage in certain mode - val sinkStorageSchema: Option[Schema] = - if (storageType == OpResultStorage.MONGODB) { - // use the output schema on the first output port as the schema for storage - Some(schema.right.get) - } else { - None + .foreach { physicalOp => + physicalOp.outputPorts + .filterNot(_._1.internal) + .foreach { + case (outputPortId, (outputPort, _, schema)) => + val storage = ResultStorage.getOpResultStorage(context.workflowId) + val storageKey = + OpResultStorage.createStorageKey(physicalOp.id.logicalOpId, outputPortId) + + // Determine the storage type, defaulting to memory for large HTML visualizations + val storageType = + if (outputPort.mode == SINGLE_SNAPSHOT) OpResultStorage.MEMORY + else OpResultStorage.defaultStorageMode + + if (!storage.contains(storageKey)) { + // Create storage if it doesn't exist + val sinkStorageSchema = + schema.getOrElse(throw new IllegalStateException("Schema is missing")) + storage.create( + s"${context.executionId}_", + storageKey, + storageType, + sinkStorageSchema + ) + + // Add sink collection name to the JSON array of sinks + sinksPointers.add( + objectMapper + .createObjectNode() + .put("storageType", storageType) + .put("storageKey", s"${context.executionId}_$storageKey") + ) } - storage.create( - s"${context.executionId}_", - storageKey, - storageType, - sinkStorageSchema - ) - // add 
the sink collection name to the JSON array of sinks - val storageNode = objectMapper.createObjectNode() - storageNode.put("storageType", storageType) - storageNode.put("storageKey", s"${context.executionId}_$storageKey") - sinksPointers.add(storageNode) - } - val sinkPhysicalOp = SpecialPhysicalOpFactory.newSinkPhysicalOp( - context.workflowId, - context.executionId, - storageKey.id, - outputPort.mode - ) - val sinkLink = - PhysicalLink( - physicalOp.id, - outputPort.id, - sinkPhysicalOp.id, - PortIdentity(internal = true) - ) - physicalPlan = physicalPlan.addOperator(sinkPhysicalOp).addLink(sinkLink) - }) + // Create and link the sink operator + val sinkPhysicalOp = SpecialPhysicalOpFactory.newSinkPhysicalOp( + context.workflowId, + context.executionId, + storageKey, + outputPort.mode + ) + val sinkLink = PhysicalLink( + physicalOp.id, + outputPort.id, + sinkPhysicalOp.id, + sinkPhysicalOp.outputPorts.head._1 + ) + + physicalPlan = physicalPlan.addOperator(sinkPhysicalOp).addLink(sinkLink) + } + } } match { case Success(_) => diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala index e56e0973e15..2f6f8ab67d5 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala @@ -69,8 +69,16 @@ class DataProcessingSpec .registerCallback[ExecutionStateUpdate](evt => { if (evt.state == COMPLETED) { results = workflow.logicalPlan.getTerminalOperatorIds - .filter(terminalOpId => resultStorage.contains(terminalOpId)) - .map(terminalOpId => terminalOpId -> resultStorage.get(terminalOpId).get().toList) + .filter(terminalOpId => + // expecting the first output port only. 
+ resultStorage.contains(OpResultStorage.createStorageKey(terminalOpId, PortIdentity())) + ) + .map(terminalOpId => + terminalOpId -> resultStorage + .get(OpResultStorage.createStorageKey(terminalOpId, PortIdentity())) + .get() + .toList + ) .toMap completion.setDone() } diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/MongoDocument.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/MongoDocument.scala index 18baa1844fb..92fa1cdce11 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/MongoDocument.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/MongoDocument.scala @@ -21,7 +21,7 @@ import java.util.Date class MongoDocument[T >: Null <: AnyRef]( id: String, var toDocument: T => Document, - var fromDocument: Option[Document => T] = None + var fromDocument: Document => T ) extends VirtualDocument[T] { /** @@ -74,9 +74,7 @@ class MongoDocument[T >: Null <: AnyRef]( override def hasNext: Boolean = cursor.hasNext override def next(): T = { - val fromDocumentFunc = - fromDocument.getOrElse(throw new NotImplementedError("fromDocument is not set")) - fromDocumentFunc(cursor.next()) + fromDocument(cursor.next()) } }.iterator } @@ -133,9 +131,7 @@ class MongoDocument[T >: Null <: AnyRef]( if (!cursor.hasNext) { throw new RuntimeException(f"Index $i out of bounds") } - val fromDocumentFunc = - fromDocument.getOrElse(throw new NotImplementedError("fromDocument is not set")) - fromDocumentFunc(cursor.next()) + fromDocument(cursor.next()) } /** @@ -146,19 +142,6 @@ class MongoDocument[T >: Null <: AnyRef]( collectionMgr.getCount } - /** - * Set the deserializer, i.e. from Document to T. This method can only be called once. - * - * @param fromDocument : the deserializer, convert MongoDB's Document to T. - * @throws IllegalStateException if setSerde is called more than once. 
- */ - def setDeserde(fromDocument: Document => T): Unit = { - if (this.fromDocument.isDefined) { - throw new IllegalStateException("setSerde can only be called once.") - } - this.fromDocument = Some(fromDocument) - } - def getNumericColStats: Map[String, Map[String, Double]] = collectionMgr.calculateNumericStats() diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala index ab69d6f94d2..42728231270 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala @@ -5,103 +5,153 @@ import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.amber.core.storage.model.VirtualDocument import edu.uci.ics.amber.core.tuple.{Schema, Tuple} import edu.uci.ics.amber.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.workflow.PortIdentity import java.util.concurrent.ConcurrentHashMap -import scala.collection.convert.ImplicitConversions.`iterator asScala` +import scala.jdk.CollectionConverters.IteratorHasAsScala +/** + * Companion object for `OpResultStorage`, providing utility functions + * for key generation, decoding, and storage modes. + */ object OpResultStorage { val defaultStorageMode: String = StorageConfig.resultStorageMode.toLowerCase - val MEMORY = "memory" - val MONGODB = "mongodb" + val MEMORY: String = "memory" + val MONGODB: String = "mongodb" + + /** + * Creates a unique storage key by combining operator and port identities. + * + * @param operatorId The unique identifier of the operator. + * @param portIdentity The unique identifier of the port. + * @param isMaterialized Indicates whether the storage is materialized (e.g., persisted). 
+   * @return A string representing the generated storage key, formatted as:
+   *         "materialized_{operatorId}_{portId}_{internal}" if materialized,
+   *         otherwise "{operatorId}_{portId}_{internal}".
+   */
+  def createStorageKey(
+      operatorId: OperatorIdentity,
+      portIdentity: PortIdentity,
+      isMaterialized: Boolean = false
+  ): String = {
+    val prefix = if (isMaterialized) "materialized_" else ""
+    s"$prefix${operatorId.id}_${portIdentity.id}_${portIdentity.internal}"
+  }
+
+  /**
+    * Decodes a storage key back into its original components.
+    *
+    * @param key The storage key to decode.
+    * @return A tuple containing the operator identity and port identity.
+    * @throws IllegalArgumentException If the key format is invalid.
+    */
+  def decodeStorageKey(key: String): (OperatorIdentity, PortIdentity) = {
+    val processedKey = if (key.startsWith("materialized_")) key.substring(13) else key
+    processedKey.split("_", 3) match {
+      case Array(opId, portId, internal) =>
+        (OperatorIdentity(opId), PortIdentity(portId.toInt, internal.toBoolean))
+      case _ =>
+        throw new IllegalArgumentException(s"Invalid storage key: $key")
+    }
+  }
 }
 
 /**
- * Public class of operator result storage.
- * One execution links one instance of OpResultStorage, both have the same lifecycle.
+ * Handles the storage of operator results during workflow execution.
+ * Each `OpResultStorage` instance is tied to the lifecycle of a single execution.
  */
 class OpResultStorage extends Serializable with LazyLogging {
 
-  // since some op need to get the schema from the OpResultStorage, the schema is stored as part of the OpResultStorage.cache
-  // TODO: once we make the storage self-contained, i.e. storing Schema in the storage as metadata, we can remove it
-  val cache: ConcurrentHashMap[OperatorIdentity, (VirtualDocument[Tuple], Option[Schema])] =
-    new ConcurrentHashMap[OperatorIdentity, (VirtualDocument[Tuple], Option[Schema])]()
+  /**
+    * In-memory cache for storing results and their associated schemas.
+ * TODO: Once the storage is self-contained (i.e., stores schemas as metadata), + * this can be removed. + */ + private val cache: ConcurrentHashMap[String, (VirtualDocument[Tuple], Schema)] = + new ConcurrentHashMap() /** - * Retrieve the result of an operator from OpResultStorage - * @param key The key used for storage and retrieval. - * Currently it is the uuid inside the cache source or cache sink operator. - * @return The storage object of this operator. + * Retrieves the result of an operator from the storage. + * + * @param key The storage key associated with the result. + * @return The result stored as a `VirtualDocument[Tuple]`. + * @throws NoSuchElementException If the key is not found in the cache. */ - def get(key: OperatorIdentity): VirtualDocument[Tuple] = { - cache.get(key)._1 + def get(key: String): VirtualDocument[Tuple] = { + Option(cache.get(key)) match { + case Some((document, _)) => document + case None => throw new NoSuchElementException(s"Storage with key $key not found") + } } /** - * Retrieve the schema of the result associate with target operator - * @param key the uuid inside the cache source or cache sink operator. - * @return The result schema of this operator. + * Retrieves the schema associated with an operator's result. + * + * @param key The storage key associated with the schema. + * @return The schema of the result. */ - def getSchema(key: OperatorIdentity): Option[Schema] = { + def getSchema(key: String): Schema = { cache.get(key)._2 } - def setSchema(key: OperatorIdentity, schema: Schema): Unit = { - val storage = get(key) - cache.put(key, (storage, Some(schema))) - } - + /** + * Creates a new storage object for an operator result. + * + * @param executionId An optional execution ID for unique identification. + * @param key The storage key for the result. + * @param mode The storage mode (e.g., "memory" or "mongodb"). + * @param schema The schema of the result. 
+    * @return A `VirtualDocument[Tuple]` instance for storing results.
+    */
   def create(
       executionId: String = "",
-      key: OperatorIdentity,
+      key: String,
       mode: String,
-      schema: Option[Schema] = None
+      schema: Schema
   ): VirtualDocument[Tuple] = {
     val storage: VirtualDocument[Tuple] =
-      if (mode == "memory") {
-        new MemoryDocument[Tuple](key.id)
+      if (mode == OpResultStorage.MEMORY) {
+        new MemoryDocument[Tuple](key)
       } else {
         try {
-          val fromDocument = schema.map(Tuple.fromDocument)
-          new MongoDocument[Tuple](executionId + key, Tuple.toDocument, fromDocument)
+          new MongoDocument[Tuple](
+            executionId + key,
+            Tuple.toDocument,
+            Tuple.fromDocument(schema)
+          )
         } catch {
           case t: Throwable =>
-            logger.warn("Failed to create mongo storage", t)
-            logger.info(s"Fall back to memory storage for $key")
-            // fall back to memory
-            new MemoryDocument[Tuple](key.id)
+            logger.warn("Failed to create MongoDB storage", t)
+            logger.info(s"Falling back to memory storage for $key")
+            new MemoryDocument[Tuple](key)
         }
       }
     cache.put(key, (storage, schema))
     storage
   }
 
-  def contains(key: OperatorIdentity): Boolean = {
-    cache.containsKey(key)
-  }
-
   /**
-   * Manually remove an entry from the cache.
-   * @param key The key used for storage and retrieval.
-   *            Currently it is the uuid inside the cache source or cache sink operator.
+    * Checks if a storage key exists in the cache.
+    *
+    * @param key The storage key to check.
+    * @return True if the key exists, false otherwise.
     */
-  def remove(key: OperatorIdentity): Unit = {
-    if (cache.contains(key)) {
-      cache.get(key)._1.clear()
-    }
-    cache.remove(key)
-  }
+  def contains(key: String): Boolean = cache.containsKey(key)
 
   /**
-   * Close this storage. Used for workflow cleanup.
+   * Clears all stored results. Typically used during workflow cleanup.
*/ def clear(): Unit = { cache.forEach((_, document) => document._1.clear()) cache.clear() } - def getAllKeys: Set[OperatorIdentity] = { - cache.keySet().iterator().toSet + /** + * Retrieves all storage keys currently in the cache. + * + * @return A set of all keys in the cache. + */ + def getAllKeys: Set[String] = { + cache.keySet().iterator().asScala.toSet } - } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala index 0024993166f..64028a3fef8 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala @@ -1,10 +1,12 @@ package edu.uci.ics.amber.operator import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.storage.result.{OpResultStorage, ResultStorage} import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.sink.ProgressiveUtils import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpExec +import edu.uci.ics.amber.operator.source.cache.CacheSourceOpExec import edu.uci.ics.amber.virtualidentity.{ ExecutionIdentity, OperatorIdentity, @@ -21,10 +23,11 @@ object SpecialPhysicalOpFactory { executionIdentity: ExecutionIdentity, storageKey: String, outputMode: OutputMode - ): PhysicalOp = + ): PhysicalOp = { + val (opId, portId) = OpResultStorage.decodeStorageKey(storageKey) PhysicalOp .localPhysicalOp( - PhysicalOpIdentity(OperatorIdentity(storageKey), "sink"), + PhysicalOpIdentity(opId, s"sink${portId.id}"), workflowIdentity, executionIdentity, OpExecInitInfo((idx, workers) => @@ -68,4 +71,31 @@ object SpecialPhysicalOpFactory { Map(PortIdentity(internal = true) -> outputSchema) }) ) + } + + def newSourcePhysicalOp( + 
workflowIdentity: WorkflowIdentity, + executionIdentity: ExecutionIdentity, + storageKey: String + ): PhysicalOp = { + + val (opId, portId) = OpResultStorage.decodeStorageKey(storageKey) + val opResultStorage = ResultStorage.getOpResultStorage(workflowIdentity) + val outputPort = OutputPort() + PhysicalOp + .sourcePhysicalOp( + PhysicalOpIdentity(opId, s"source${portId.id}"), + workflowIdentity, + executionIdentity, + OpExecInitInfo((_, _) => new CacheSourceOpExec(storageKey, workflowIdentity)) + ) + .withInputPorts(List.empty) + .withOutputPorts(List(outputPort)) + .withPropagateSchema( + SchemaPropagationFunc(_ => Map(outputPort.id -> opResultStorage.getSchema(storageKey))) + ) + .propagateSchema() + + } + } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala index aaeababd685..7290e064159 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala @@ -15,7 +15,7 @@ class ProgressiveSinkOpExec( workflowIdentity: WorkflowIdentity ) extends SinkOperatorExecutor { val writer: BufferedItemWriter[Tuple] = - ResultStorage.getOpResultStorage(workflowIdentity).get(OperatorIdentity(storageKey)).writer() + ResultStorage.getOpResultStorage(workflowIdentity).get(storageKey).writer() override def open(): Unit = { writer.open() diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpExec.scala index 2fa7772d3dc..ba335d1d5c9 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpExec.scala +++ 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpExec.scala @@ -2,12 +2,14 @@ package edu.uci.ics.amber.operator.source.cache import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.executor.SourceOperatorExecutor -import edu.uci.ics.amber.core.storage.model.VirtualDocument -import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} +import edu.uci.ics.amber.core.storage.result.ResultStorage +import edu.uci.ics.amber.core.tuple.TupleLike +import edu.uci.ics.amber.virtualidentity.WorkflowIdentity -class CacheSourceOpExec(storage: VirtualDocument[Tuple]) +class CacheSourceOpExec(storageKey: String, workflowIdentity: WorkflowIdentity) extends SourceOperatorExecutor with LazyLogging { + private val storage = ResultStorage.getOpResultStorage(workflowIdentity).get(storageKey) override def produceTuple(): Iterator[TupleLike] = storage.get() From 04145440f7e8addf68752ad558e96e09924aa549 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Sun, 29 Dec 2024 23:52:22 -0800 Subject: [PATCH 17/47] Convert Java operator descriptors to Scala (#3179) There are a few operator descriptors written in Java, which makes them difficult to use and maintain. This PR converts all such descriptors to Scala to streamline the migration process to new APIs and facilitate future work, such as operator offloading. 
Changed Operators: - PythonUDFSourceOpDescV2 - RUDFSourceOpDesc - SentimentAnalysisOpDesc - SpecializedFilterOpDesc - TypeCastingOpDesc --- .../WorkflowCompilationResourceSpec.scala | 6 +- .../amber/operator/filter/FilterOpExec.scala | 4 +- .../filter/SpecializedFilterOpDesc.java | 58 -------- .../filter/SpecializedFilterOpDesc.scala | 41 ++++++ .../filter/SpecializedFilterOpExec.java | 21 --- .../filter/SpecializedFilterOpExec.scala | 8 ++ .../sentiment/SentimentAnalysisOpExec.java | 4 +- .../sink/managed/ProgressiveSinkOpExec.scala | 7 +- .../typecasting/TypeCastingOpDesc.java | 96 ------------- .../typecasting/TypeCastingOpDesc.scala | 59 ++++++++ .../typecasting/TypeCastingOpExec.scala | 6 +- .../source/PythonUDFSourceOpDescV2.java | 120 ---------------- .../source/PythonUDFSourceOpDescV2.scala | 86 ++++++++++++ .../operator/udf/r/RUDFSourceOpDesc.java | 129 ------------------ .../operator/udf/r/RUDFSourceOpDesc.scala | 94 +++++++++++++ .../filter/SpecializedFilterOpExecSpec.scala | 19 ++- .../typecasting/TypeCastingOpExecSpec.scala | 4 +- 17 files changed, 312 insertions(+), 450 deletions(-) delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.java create mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.java create mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.scala delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.java create mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.java create mode 100644 
core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala delete mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.java create mode 100644 core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala diff --git a/core/workflow-compiling-service/src/test/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResourceSpec.scala b/core/workflow-compiling-service/src/test/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResourceSpec.scala index 058aa443d59..669df1e6e60 100644 --- a/core/workflow-compiling-service/src/test/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResourceSpec.scala +++ b/core/workflow-compiling-service/src/test/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResourceSpec.scala @@ -70,7 +70,7 @@ class WorkflowCompilationResourceSpec extends AnyFlatSpec with BeforeAndAfterAll // utility function to create a filter op private def getFilterOpDesc( - filterPredicates: java.util.List[FilterPredicate] + filterPredicates: List[FilterPredicate] ): FilterOpDesc = { val filterOpDesc = new SpecializedFilterOpDesc filterOpDesc.predicates = filterPredicates @@ -116,11 +116,11 @@ class WorkflowCompilationResourceSpec extends AnyFlatSpec with BeforeAndAfterAll // Create the filter predicate for TotalProfit > 10000 val filterPredicate1 = new FilterPredicate("Total Profit", ComparisonType.GREATER_THAN, "10000") - val filterOpDesc1 = getFilterOpDesc(java.util.Arrays.asList(filterPredicate1)) + val filterOpDesc1 = getFilterOpDesc(List(filterPredicate1)) // Create the filter predicate for Region != "JPN" val filterPredicate2 = new FilterPredicate("Region", ComparisonType.NOT_EQUAL_TO, "JPN") - val filterOpDesc2 = getFilterOpDesc(java.util.Arrays.asList(filterPredicate2)) + val filterOpDesc2 = getFilterOpDesc(List(filterPredicate2)) // Add a second limit operation val limitOpDesc2 = 
getLimitOpDesc(5) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpExec.scala index 3b7eda12f4f..a1db9102934 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpExec.scala @@ -7,8 +7,8 @@ abstract class FilterOpExec extends OperatorExecutor with Serializable { var filterFunc: Tuple => Boolean = _ - def setFilterFunc(func: Tuple => java.lang.Boolean): Unit = - filterFunc = (tuple: Tuple) => func.apply(tuple).booleanValue() + def setFilterFunc(func: Tuple => Boolean): Unit = + filterFunc = func override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] = if (filterFunc(tuple)) Iterator.single(tuple) else Iterator.empty diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.java deleted file mode 100644 index ea35f40dee0..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.java +++ /dev/null @@ -1,58 +0,0 @@ -package edu.uci.ics.amber.operator.filter; - -import com.fasterxml.jackson.annotation.JsonProperty; -import com.fasterxml.jackson.annotation.JsonPropertyDescription; -import edu.uci.ics.amber.core.executor.OpExecInitInfo; -import edu.uci.ics.amber.core.executor.OperatorExecutor; -import edu.uci.ics.amber.core.workflow.PhysicalOp; -import edu.uci.ics.amber.operator.metadata.OperatorGroupConstants; -import edu.uci.ics.amber.operator.metadata.OperatorInfo; -import edu.uci.ics.amber.virtualidentity.ExecutionIdentity; -import edu.uci.ics.amber.virtualidentity.WorkflowIdentity; -import edu.uci.ics.amber.workflow.InputPort; -import edu.uci.ics.amber.workflow.OutputPort; 
-import edu.uci.ics.amber.workflow.PortIdentity; -import scala.Tuple2; - -import java.util.ArrayList; -import java.util.function.Function; - -import static java.util.Collections.singletonList; -import static scala.jdk.javaapi.CollectionConverters.asScala; - -public class SpecializedFilterOpDesc extends FilterOpDesc { - - @JsonProperty(value = "predicates", required = true) - @JsonPropertyDescription("multiple predicates in OR") - public java.util.List predicates; - - @Override - public PhysicalOp getPhysicalOp(WorkflowIdentity workflowId, ExecutionIdentity executionId) { - return PhysicalOp.oneToOnePhysicalOp( - workflowId, - executionId, - operatorIdentifier(), - OpExecInitInfo.apply( - (Function, OperatorExecutor> & java.io.Serializable) - x -> new edu.uci.ics.amber.operator.filter.SpecializedFilterOpExec(this.predicates) - ) - ) - .withInputPorts(operatorInfo().inputPorts()) - .withOutputPorts(operatorInfo().outputPorts()); - } - - @Override - public OperatorInfo operatorInfo() { - return new OperatorInfo( - "Filter", - "Performs a filter operation", - OperatorGroupConstants.CLEANING_GROUP(), - asScala(singletonList(new InputPort(new PortIdentity(0, false), "", false, asScala(new ArrayList()).toSeq()))).toList(), - asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(), - false, - false, - true, - false - ); - } -} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala new file mode 100644 index 00000000000..770334dc396 --- /dev/null +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala @@ -0,0 +1,41 @@ +package edu.uci.ics.amber.operator.filter + +import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} +import 
edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.workflow.PhysicalOp +import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.workflow.{InputPort, OutputPort} + +class SpecializedFilterOpDesc extends FilterOpDesc { + + @JsonProperty(value = "predicates", required = true) + @JsonPropertyDescription("multiple predicates in OR") + var predicates: List[FilterPredicate] = List.empty + + override def getPhysicalOp( + workflowId: WorkflowIdentity, + executionId: ExecutionIdentity + ): PhysicalOp = { + PhysicalOp + .oneToOnePhysicalOp( + workflowId, + executionId, + operatorIdentifier, + OpExecInitInfo((_, _) => new SpecializedFilterOpExec(predicates)) + ) + .withInputPorts(operatorInfo.inputPorts) + .withOutputPorts(operatorInfo.outputPorts) + } + + override def operatorInfo: OperatorInfo = { + OperatorInfo( + "Filter", + "Performs a filter operation", + OperatorGroupConstants.CLEANING_GROUP, + List(InputPort()), + List(OutputPort()), + supportReconfiguration = true + ) + } +} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.java deleted file mode 100644 index 0d7eb9a0136..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.java +++ /dev/null @@ -1,21 +0,0 @@ -package edu.uci.ics.amber.operator.filter; - -import edu.uci.ics.amber.core.tuple.Tuple; -import scala.Function1; - -import java.io.Serializable; - -public class SpecializedFilterOpExec extends FilterOpExec { - - private final java.util.List predicates; - - public SpecializedFilterOpExec(java.util.List predicates) { - this.predicates = predicates; - setFilterFunc((Function1 & Serializable) this::filterFunc); - } - - public Boolean 
filterFunc(Tuple tuple) { - return predicates.stream().anyMatch(predicate -> predicate.evaluate(tuple)); - } - -} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.scala new file mode 100644 index 00000000000..096721decac --- /dev/null +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.scala @@ -0,0 +1,8 @@ +package edu.uci.ics.amber.operator.filter + +import edu.uci.ics.amber.core.tuple.Tuple + +class SpecializedFilterOpExec(predicates: List[FilterPredicate]) extends FilterOpExec { + + setFilterFunc((tuple: Tuple) => predicates.exists(_.evaluate(tuple))) +} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpExec.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpExec.java index 4e64af8e263..038e12e461f 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpExec.java +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpExec.java @@ -17,13 +17,13 @@ public class SentimentAnalysisOpExec extends MapOpExec { private final String attributeName; - private final edu.uci.ics.amber.operator.sentiment.StanfordCoreNLPWrapper coreNlp; + private final StanfordCoreNLPWrapper coreNlp; public SentimentAnalysisOpExec(String attributeName) { this.attributeName = attributeName; Properties props = new Properties(); props.setProperty("annotators", "tokenize, ssplit, parse, sentiment"); - coreNlp = new edu.uci.ics.amber.operator.sentiment.StanfordCoreNLPWrapper(props); + coreNlp = new StanfordCoreNLPWrapper(props); this.setMapFunc((Function1 & Serializable) this::sentimentAnalysis); } diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala index 7290e064159..30a604ea924 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala @@ -5,7 +5,7 @@ import edu.uci.ics.amber.core.storage.model.BufferedItemWriter import edu.uci.ics.amber.core.storage.result.ResultStorage import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} import edu.uci.ics.amber.operator.sink.ProgressiveUtils -import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, WorkflowIdentity} +import edu.uci.ics.amber.virtualidentity.WorkflowIdentity import edu.uci.ics.amber.workflow.OutputPort.OutputMode import edu.uci.ics.amber.workflow.PortIdentity @@ -43,9 +43,12 @@ class ProgressiveSinkOpExec( } override def onFinishMultiPort(port: Int): Iterator[(TupleLike, Option[PortIdentity])] = { - writer.close() Iterator.empty } + override def close(): Unit = { + writer.close() + } + override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] = Iterator.empty } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.java deleted file mode 100644 index 87d902f01f9..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.java +++ /dev/null @@ -1,96 +0,0 @@ -package edu.uci.ics.amber.operator.typecasting; - -import com.fasterxml.jackson.annotation.JsonProperty; -import com.fasterxml.jackson.annotation.JsonPropertyDescription; -import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle; -import edu.uci.ics.amber.core.executor.OpExecInitInfo; 
-import edu.uci.ics.amber.core.executor.OperatorExecutor; -import edu.uci.ics.amber.core.tuple.AttributeTypeUtils; -import edu.uci.ics.amber.core.tuple.Schema; -import edu.uci.ics.amber.core.workflow.PhysicalOp; -import edu.uci.ics.amber.core.workflow.SchemaPropagationFunc; -import edu.uci.ics.amber.operator.map.MapOpDesc; -import edu.uci.ics.amber.operator.metadata.OperatorGroupConstants; -import edu.uci.ics.amber.operator.metadata.OperatorInfo; -import edu.uci.ics.amber.operator.util.OperatorDescriptorUtils; -import edu.uci.ics.amber.virtualidentity.ExecutionIdentity; -import edu.uci.ics.amber.virtualidentity.WorkflowIdentity; -import edu.uci.ics.amber.workflow.InputPort; -import edu.uci.ics.amber.workflow.OutputPort; -import edu.uci.ics.amber.workflow.PortIdentity; -import scala.Tuple2; -import scala.collection.immutable.Map; - -import java.io.Serializable; -import java.util.ArrayList; -import java.util.function.Function; - -import static java.util.Collections.singletonList; -import static scala.jdk.javaapi.CollectionConverters.asScala; - -public class TypeCastingOpDesc extends MapOpDesc { - - @JsonProperty(required = true) - @JsonSchemaTitle("TypeCasting Units") - @JsonPropertyDescription("Multiple type castings") - public java.util.List typeCastingUnits = new ArrayList<>(); - - @Override - public PhysicalOp getPhysicalOp(WorkflowIdentity workflowId, ExecutionIdentity executionId) { - if (typeCastingUnits == null) typeCastingUnits = new ArrayList<>(); - return PhysicalOp.oneToOnePhysicalOp( - workflowId, - executionId, - operatorIdentifier(), - OpExecInitInfo.apply( - (Function, OperatorExecutor> & java.io.Serializable) - worker -> new edu.uci.ics.amber.operator.typecasting.TypeCastingOpExec(typeCastingUnits) - ) - ) - .withInputPorts(operatorInfo().inputPorts()) - .withOutputPorts(operatorInfo().outputPorts()) - .withPropagateSchema( - SchemaPropagationFunc.apply((Function, Map> & Serializable) inputSchemas -> { - // Initialize a Java HashMap - java.util.Map 
javaMap = new java.util.HashMap<>(); - Schema outputSchema = inputSchemas.values().head(); - if (typeCastingUnits != null) { - for (edu.uci.ics.amber.operator.typecasting.TypeCastingUnit unit : typeCastingUnits) { - outputSchema = AttributeTypeUtils.SchemaCasting(outputSchema, unit.attribute, unit.resultType); - } - } - - javaMap.put(operatorInfo().outputPorts().head().id(), outputSchema); - - // Convert the Java Map to a Scala immutable Map - return OperatorDescriptorUtils.toImmutableMap(javaMap); - }) - ); - } - - @Override - public OperatorInfo operatorInfo() { - return new OperatorInfo( - "Type Casting", - "Cast between types", - OperatorGroupConstants.CLEANING_GROUP(), - asScala(singletonList(new InputPort(new PortIdentity(0, false), "", false, asScala(new ArrayList()).toSeq()))).toList(), - asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(), - false, - false, - false, - false - ); - } - - @Override - public Schema getOutputSchema(Schema[] schemas) { - Schema outputSchema = schemas[0]; - if (typeCastingUnits != null) { - for (edu.uci.ics.amber.operator.typecasting.TypeCastingUnit unit : typeCastingUnits) { - outputSchema = AttributeTypeUtils.SchemaCasting(outputSchema, unit.attribute, unit.resultType); - } - } - return outputSchema; - } -} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala new file mode 100644 index 00000000000..91741b73d05 --- /dev/null +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala @@ -0,0 +1,59 @@ +package edu.uci.ics.amber.operator.typecasting + +import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} +import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle +import 
edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.tuple.{AttributeTypeUtils, Schema} +import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.operator.map.MapOpDesc +import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} + +class TypeCastingOpDesc extends MapOpDesc { + + @JsonProperty(required = true) + @JsonSchemaTitle("TypeCasting Units") + @JsonPropertyDescription("Multiple type castings") + var typeCastingUnits: List[TypeCastingUnit] = List.empty + + override def getPhysicalOp( + workflowId: WorkflowIdentity, + executionId: ExecutionIdentity + ): PhysicalOp = { + if (typeCastingUnits == null) typeCastingUnits = List.empty + PhysicalOp + .oneToOnePhysicalOp( + workflowId, + executionId, + operatorIdentifier, + OpExecInitInfo((_, _) => new TypeCastingOpExec(typeCastingUnits)) + ) + .withInputPorts(operatorInfo.inputPorts) + .withOutputPorts(operatorInfo.outputPorts) + .withPropagateSchema( + SchemaPropagationFunc { inputSchemas: Map[PortIdentity, Schema] => + val outputSchema = typeCastingUnits.foldLeft(inputSchemas.values.head) { (schema, unit) => + AttributeTypeUtils.SchemaCasting(schema, unit.attribute, unit.resultType) + } + Map(operatorInfo.outputPorts.head.id -> outputSchema) + } + ) + } + + override def operatorInfo: OperatorInfo = { + OperatorInfo( + "Type Casting", + "Cast between types", + OperatorGroupConstants.CLEANING_GROUP, + List(InputPort()), + List(OutputPort()) + ) + } + + override def getOutputSchema(schemas: Array[Schema]): Schema = { + typeCastingUnits.foldLeft(schemas.head) { (schema, unit) => + AttributeTypeUtils.SchemaCasting(schema, unit.attribute, unit.resultType) + } + } +} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExec.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExec.scala index 84ecc1e7341..998d0504583 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExec.scala @@ -3,15 +3,13 @@ package edu.uci.ics.amber.operator.typecasting import edu.uci.ics.amber.core.tuple.{AttributeTypeUtils, Tuple, TupleLike} import edu.uci.ics.amber.operator.map.MapOpExec -import scala.jdk.CollectionConverters.CollectionHasAsScala - -class TypeCastingOpExec(typeCastingUnits: java.util.List[TypeCastingUnit]) extends MapOpExec { +class TypeCastingOpExec(typeCastingUnits: List[TypeCastingUnit]) extends MapOpExec { this.setMapFunc(castTuple) private def castTuple(tuple: Tuple): TupleLike = AttributeTypeUtils.tupleCasting( tuple, - typeCastingUnits.asScala + typeCastingUnits .map(typeCastingUnit => typeCastingUnit.attribute -> typeCastingUnit.resultType) .toMap ) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.java deleted file mode 100644 index 944f171c397..00000000000 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.java +++ /dev/null @@ -1,120 +0,0 @@ -package edu.uci.ics.amber.operator.udf.python.source; - -import com.fasterxml.jackson.annotation.JsonProperty; -import com.fasterxml.jackson.annotation.JsonPropertyDescription; -import com.google.common.base.Preconditions; -import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle; -import edu.uci.ics.amber.core.executor.OpExecInitInfo; -import edu.uci.ics.amber.core.tuple.Attribute; -import edu.uci.ics.amber.core.tuple.Schema; -import edu.uci.ics.amber.core.workflow.PhysicalOp; -import 
edu.uci.ics.amber.core.workflow.SchemaPropagationFunc; -import edu.uci.ics.amber.operator.metadata.OperatorGroupConstants; -import edu.uci.ics.amber.operator.metadata.OperatorInfo; -import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor; -import edu.uci.ics.amber.operator.util.OperatorDescriptorUtils; -import edu.uci.ics.amber.virtualidentity.ExecutionIdentity; -import edu.uci.ics.amber.virtualidentity.WorkflowIdentity; -import edu.uci.ics.amber.workflow.InputPort; -import edu.uci.ics.amber.workflow.OutputPort; -import edu.uci.ics.amber.workflow.PortIdentity; -import scala.Option; -import scala.collection.immutable.Map; - -import java.io.Serializable; -import java.util.ArrayList; -import java.util.List; -import java.util.function.Function; - -import static java.util.Collections.singletonList; -import static scala.jdk.javaapi.CollectionConverters.asScala; - - -public class PythonUDFSourceOpDescV2 extends SourceOperatorDescriptor { - - @JsonProperty(required = true, defaultValue = - "# Choose from the following templates:\n" + - "# \n" + - "# from pytexera import *\n" + - "# \n" + - "# class GenerateOperator(UDFSourceOperator):\n" + - "# \n" + - "# @overrides\n" + - "# def produce(self) -> Iterator[Union[TupleLike, TableLike, None]]:\n" + - "# yield\n") - @JsonSchemaTitle("Python script") - @JsonPropertyDescription("Input your code here") - public String code; - - @JsonProperty(required = true, defaultValue = "1") - @JsonSchemaTitle("Worker count") - @JsonPropertyDescription("Specify how many parallel workers to lunch") - public Integer workers = 1; - - @JsonProperty() - @JsonSchemaTitle("Columns") - @JsonPropertyDescription("The columns of the source") - public List columns; - - @Override - public PhysicalOp getPhysicalOp(WorkflowIdentity workflowId, ExecutionIdentity executionId) { - OpExecInitInfo exec = OpExecInitInfo.apply(code, "python"); - Preconditions.checkArgument(workers >= 1, "Need at least 1 worker."); - SchemaPropagationFunc func = 
SchemaPropagationFunc.apply((Function, Map> & Serializable) inputSchemas -> { - // Initialize a Java HashMap - java.util.Map javaMap = new java.util.HashMap<>(); - - javaMap.put(operatorInfo().outputPorts().head().id(), sourceSchema()); - - // Convert the Java Map to a Scala immutable Map - return OperatorDescriptorUtils.toImmutableMap(javaMap); - }); - PhysicalOp physicalOp = PhysicalOp.sourcePhysicalOp( - workflowId, - executionId, - operatorIdentifier(), - exec - ) - .withInputPorts(operatorInfo().inputPorts()) - .withOutputPorts(operatorInfo().outputPorts()) - .withIsOneToManyOp(true) - .withPropagateSchema(func) - .withLocationPreference(Option.empty()); - - - if (workers > 1) { - return physicalOp - .withParallelizable(true) - .withSuggestedWorkerNum(workers); - } else { - return physicalOp.withParallelizable(false); - } - - } - - @Override - public OperatorInfo operatorInfo() { - return new OperatorInfo( - "1-out Python UDF", - "User-defined function operator in Python script", - OperatorGroupConstants.PYTHON_GROUP(), - asScala(new ArrayList()).toList(), - asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(), - false, - false, - true, - false - ); - } - - @Override - public Schema sourceSchema() { - Schema.Builder outputSchemaBuilder = Schema.builder(); - - // for any pythonUDFType, it can add custom output columns (attributes). 
- if (columns != null) { - outputSchemaBuilder.add(asScala(columns)).build(); - } - return outputSchemaBuilder.build(); - } -} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala new file mode 100644 index 00000000000..91642ec8b3e --- /dev/null +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala @@ -0,0 +1,86 @@ +package edu.uci.ics.amber.operator.udf.python.source + +import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} +import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle +import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.tuple.{Attribute, Schema} +import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor +import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.workflow.{OutputPort, PortIdentity} + +class PythonUDFSourceOpDescV2 extends SourceOperatorDescriptor { + + @JsonProperty( + required = true, + defaultValue = "# from pytexera import *\n" + + "# class GenerateOperator(UDFSourceOperator):\n" + + "# \n" + + "# @overrides\n" + + "# \n" + + "# def produce(self) -> Iterator[Union[TupleLike, TableLike, None]]:\n" + + "# yield\n" + ) + @JsonSchemaTitle("Python script") + @JsonPropertyDescription("Input your code here") + var code: String = _ + + @JsonProperty(required = true, defaultValue = "1") + @JsonSchemaTitle("Worker count") + @JsonPropertyDescription("Specify how many parallel workers to launch") + var workers: Int = 1 + + @JsonProperty() + @JsonSchemaTitle("Columns") + @JsonPropertyDescription("The columns of the 
source")
+  var columns: List[Attribute] = List.empty
+
+  override def getPhysicalOp(
+      workflowId: WorkflowIdentity,
+      executionId: ExecutionIdentity
+  ): PhysicalOp = {
+    val exec = OpExecInitInfo(code, "python")
+    require(workers >= 1, "Need at least 1 worker.")
+
+    val func = SchemaPropagationFunc { _: Map[PortIdentity, Schema] =>
+      val outputSchema = sourceSchema()
+      Map(operatorInfo.outputPorts.head.id -> outputSchema)
+    }
+
+    val physicalOp = PhysicalOp
+      .sourcePhysicalOp(workflowId, executionId, operatorIdentifier, exec)
+      .withInputPorts(operatorInfo.inputPorts)
+      .withOutputPorts(operatorInfo.outputPorts)
+      .withIsOneToManyOp(true)
+      .withPropagateSchema(func)
+      .withLocationPreference(Option.empty)
+
+    if (workers > 1) {
+      physicalOp
+        .withParallelizable(true)
+        .withSuggestedWorkerNum(workers)
+    } else {
+      physicalOp.withParallelizable(false)
+    }
+  }
+
+  override def operatorInfo: OperatorInfo = {
+    OperatorInfo(
+      "1-out Python UDF",
+      "User-defined function operator in Python script",
+      OperatorGroupConstants.PYTHON_GROUP,
+      List.empty, // No input ports for a source operator
+      List(OutputPort()),
+      supportReconfiguration = true
+    )
+  }
+
+  override def sourceSchema(): Schema = {
+    val outputSchemaBuilder = Schema.builder()
+    if (columns != null && columns.nonEmpty) {
+      outputSchemaBuilder.add(columns)
+    }
+    outputSchemaBuilder.build()
+  }
+}
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.java
deleted file mode 100644
index 2b5785d9468..00000000000
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.java
+++ /dev/null
@@ -1,129 +0,0 @@
-package edu.uci.ics.amber.operator.udf.r;
-
-import com.fasterxml.jackson.annotation.JsonProperty;
-import com.fasterxml.jackson.annotation.JsonPropertyDescription;
-import com.google.common.base.Preconditions;
-import 
com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle; -import edu.uci.ics.amber.core.executor.OpExecInitInfo; -import edu.uci.ics.amber.core.tuple.Attribute; -import edu.uci.ics.amber.core.tuple.Schema; -import edu.uci.ics.amber.core.workflow.PhysicalOp; -import edu.uci.ics.amber.core.workflow.SchemaPropagationFunc; -import edu.uci.ics.amber.operator.metadata.OperatorGroupConstants; -import edu.uci.ics.amber.operator.metadata.OperatorInfo; -import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor; -import edu.uci.ics.amber.operator.util.OperatorDescriptorUtils; -import edu.uci.ics.amber.virtualidentity.ExecutionIdentity; -import edu.uci.ics.amber.virtualidentity.WorkflowIdentity; -import edu.uci.ics.amber.workflow.InputPort; -import edu.uci.ics.amber.workflow.OutputPort; -import edu.uci.ics.amber.workflow.PortIdentity; -import scala.Option; -import scala.collection.immutable.Map; - -import java.io.Serializable; -import java.util.ArrayList; -import java.util.List; -import java.util.function.Function; - -import static java.util.Collections.singletonList; -import static scala.jdk.javaapi.CollectionConverters.asScala; - - -public class RUDFSourceOpDesc extends SourceOperatorDescriptor { - @JsonProperty( - required = true, - defaultValue = - "# If using Table API:\n" + - "# function() { \n" + - "# return (data.frame(Column_Here = \"Value_Here\")) \n" + - "# }\n" + - "\n" + - "# If using Tuple API:\n" + - "# library(coro)\n" + - "# coro::generator(function() {\n" + - "# yield (list(text= \"hello world!\"))\n" + - "# })" - ) - @JsonSchemaTitle("R Source UDF Script") - @JsonPropertyDescription("Input your code here") - public String code; - - @JsonProperty(required = true, defaultValue = "1") - @JsonSchemaTitle("Worker count") - @JsonPropertyDescription("Specify how many parallel workers to lunch") - public Integer workers = 1; - - @JsonProperty(required = true, defaultValue = "false") - @JsonSchemaTitle("Use Tuple API?") - @JsonPropertyDescription("Check 
this box to use Tuple API, leave unchecked to use Table API") - public Boolean useTupleAPI = false; - - @JsonProperty() - @JsonSchemaTitle("Columns") - @JsonPropertyDescription("The columns of the source") - public List columns; - - @Override - public PhysicalOp getPhysicalOp(WorkflowIdentity workflowId, ExecutionIdentity executionId) { - String r_operator_type = useTupleAPI ? "r-tuple" : "r-table"; - OpExecInitInfo exec = OpExecInitInfo.apply(code, r_operator_type); - Preconditions.checkArgument(workers >= 1, "Need at least 1 worker."); - SchemaPropagationFunc func = SchemaPropagationFunc.apply((Function, Map> & Serializable) inputSchemas -> { - // Initialize a Java HashMap - java.util.Map javaMap = new java.util.HashMap<>(); - - javaMap.put(operatorInfo().outputPorts().head().id(), sourceSchema()); - - // Convert the Java Map to a Scala immutable Map - return OperatorDescriptorUtils.toImmutableMap(javaMap); - }); - PhysicalOp physicalOp = PhysicalOp.sourcePhysicalOp( - workflowId, - executionId, - operatorIdentifier(), - exec - ) - .withInputPorts(operatorInfo().inputPorts()) - .withOutputPorts(operatorInfo().outputPorts()) - .withIsOneToManyOp(true) - .withPropagateSchema(func) - .withLocationPreference(Option.empty()); - - - if (workers > 1) { - return physicalOp - .withParallelizable(true) - .withSuggestedWorkerNum(workers); - } else { - return physicalOp.withParallelizable(false); - } - - } - - @Override - public OperatorInfo operatorInfo() { - return new OperatorInfo( - "1-out R UDF", - "User-defined function operator in R script", - OperatorGroupConstants.R_GROUP(), - asScala(new ArrayList()).toList(), - asScala(singletonList(new OutputPort(new PortIdentity(0, false), "", false, OutputPort.OutputMode$.MODULE$.fromValue(0)))).toList(), - false, - false, - false, - false - ); - } - - @Override - public Schema sourceSchema() { - Schema.Builder outputSchemaBuilder = Schema.builder(); - - // for any UDFType, it can add custom output columns (attributes). 
- if (columns != null) { - outputSchemaBuilder.add(asScala(columns)).build(); - } - return outputSchemaBuilder.build(); - } -} \ No newline at end of file diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala new file mode 100644 index 00000000000..7dd22c92436 --- /dev/null +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala @@ -0,0 +1,94 @@ +package edu.uci.ics.amber.operator.udf.r + +import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} +import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle +import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.tuple.{Attribute, Schema} +import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor +import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.workflow.{OutputPort, PortIdentity} + +class RUDFSourceOpDesc extends SourceOperatorDescriptor { + + @JsonProperty( + required = true, + defaultValue = "# If using Table API:\n" + + "# function() { \n" + + "# return (data.frame(Column_Here = \"Value_Here\")) \n" + + "# }\n" + + "\n" + + "# If using Tuple API:\n" + + "# library(coro)\n" + + "# coro::generator(function() {\n" + + "# yield (list(text= \"hello world!\"))\n" + + "# })" + ) + @JsonSchemaTitle("R Source UDF Script") + @JsonPropertyDescription("Input your code here") + var code: String = _ + + @JsonProperty(required = true, defaultValue = "1") + @JsonSchemaTitle("Worker count") + @JsonPropertyDescription("Specify how many parallel workers to launch") + var workers: Int = 1 + + @JsonProperty(required = true, defaultValue = "false") + 
@JsonSchemaTitle("Use Tuple API?")
+  @JsonPropertyDescription("Check this box to use Tuple API, leave unchecked to use Table API")
+  var useTupleAPI: Boolean = false
+
+  @JsonProperty()
+  @JsonSchemaTitle("Columns")
+  @JsonPropertyDescription("The columns of the source")
+  var columns: List[Attribute] = List.empty
+
+  override def getPhysicalOp(
+      workflowId: WorkflowIdentity,
+      executionId: ExecutionIdentity
+  ): PhysicalOp = {
+    val rOperatorType = if (useTupleAPI) "r-tuple" else "r-table"
+    val exec = OpExecInitInfo(code, rOperatorType)
+    require(workers >= 1, "Need at least 1 worker.")
+
+    val func = SchemaPropagationFunc { _: Map[PortIdentity, Schema] =>
+      val outputSchema = sourceSchema()
+      Map(operatorInfo.outputPorts.head.id -> outputSchema)
+    }
+
+    val physicalOp = PhysicalOp
+      .sourcePhysicalOp(workflowId, executionId, operatorIdentifier, exec)
+      .withInputPorts(operatorInfo.inputPorts)
+      .withOutputPorts(operatorInfo.outputPorts)
+      .withIsOneToManyOp(true)
+      .withPropagateSchema(func)
+      .withLocationPreference(None)
+
+    if (workers > 1) {
+      physicalOp
+        .withParallelizable(true)
+        .withSuggestedWorkerNum(workers)
+    } else {
+      physicalOp.withParallelizable(false)
+    }
+  }
+
+  override def operatorInfo: OperatorInfo = {
+    OperatorInfo(
+      "1-out R UDF",
+      "User-defined function operator in R script",
+      OperatorGroupConstants.R_GROUP,
+      List.empty, // No input ports for a source operator
+      List(OutputPort())
+    )
+  }
+
+  override def sourceSchema(): Schema = {
+    val outputSchemaBuilder = Schema.builder()
+    if (columns != null && columns.nonEmpty) {
+      outputSchemaBuilder.add(columns)
+    }
+    outputSchemaBuilder.build()
+  }
+}
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala
index 7266f6168dd..2826630cee3 100644
--- 
a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala @@ -4,7 +4,6 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema, Tuple} import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec -import java.util.Arrays.asList class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { val inputPort: Int = 0 @@ -45,7 +44,7 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { .build() it should "open and close" in { - val opExec = new SpecializedFilterOpExec(asList()) + val opExec = new SpecializedFilterOpExec(List()) opExec.open() opExec.close() } @@ -60,7 +59,7 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "do nothing when predicates is an empty list" in { - val opExec = new SpecializedFilterOpExec(asList()) + val opExec = new SpecializedFilterOpExec(List()) opExec.open() assert(opExec.processTuple(allNullTuple, inputPort).isEmpty) opExec.close() @@ -68,16 +67,16 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { it should "not have is_null comparisons be affected by values" in { val opExec = new SpecializedFilterOpExec( - asList(new FilterPredicate("string", ComparisonType.IS_NULL, "value")) + List(new FilterPredicate("string", ComparisonType.IS_NULL, "value")) ) opExec.open() - assert(!opExec.processTuple(allNullTuple, inputPort).isEmpty) + assert(opExec.processTuple(allNullTuple, inputPort).nonEmpty) opExec.close() } it should "not have is_not_null comparisons be affected by values" in { val opExec = new SpecializedFilterOpExec( - asList(new FilterPredicate("string", ComparisonType.IS_NOT_NULL, "value")) + List(new FilterPredicate("string", ComparisonType.IS_NOT_NULL, "value")) ) opExec.open() assert(opExec.processTuple(allNullTuple, inputPort).isEmpty) 
@@ -91,7 +90,7 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { assert(attributes.length == 1) val opExec = new SpecializedFilterOpExec( - asList(new FilterPredicate(attributes(0).getName, ComparisonType.IS_NULL, null)) + List(new FilterPredicate(attributes.head.getName, ComparisonType.IS_NULL, null)) ) opExec.open() @@ -102,7 +101,7 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { it should "filter out non null tuples when filtering is_null" in { val opExec = new SpecializedFilterOpExec( - asList(new FilterPredicate("string", ComparisonType.IS_NULL, "value")) + List(new FilterPredicate("string", ComparisonType.IS_NULL, "value")) ) opExec.open() assert(opExec.processTuple(nonNullTuple, inputPort).isEmpty) @@ -111,7 +110,7 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { it should "output non null tuples when filter is_not_null" in { val opExec = new SpecializedFilterOpExec( - asList(new FilterPredicate("string", ComparisonType.IS_NOT_NULL, "value")) + List(new FilterPredicate("string", ComparisonType.IS_NOT_NULL, "value")) ) opExec.open() assert(opExec.processTuple(nonNullTuple, inputPort).nonEmpty) @@ -125,7 +124,7 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { assert(attributes.length == 1) val opExec = new SpecializedFilterOpExec( - asList(new FilterPredicate(attributes(0).getName, ComparisonType.IS_NOT_NULL, null)) + List(new FilterPredicate(attributes.head.getName, ComparisonType.IS_NOT_NULL, null)) ) opExec.open() diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala index 5b1bad8f301..ae1070f62ba 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala +++ 
b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala @@ -25,9 +25,7 @@ class TypeCastingOpExecSpec extends AnyFlatSpec with BeforeAndAfter { val castingUnit2 = new TypeCastingUnit() castingUnit2.attribute = "field3" castingUnit2.resultType = AttributeType.STRING - val castingUnits: java.util.List[TypeCastingUnit] = new java.util.ArrayList() - castingUnits.add(castingUnit1) - castingUnits.add(castingUnit2) + val castingUnits: List[TypeCastingUnit] = List(castingUnit1, castingUnit2) val tuple: Tuple = Tuple .builder(tupleSchema) From 2c028f89e33283eedf751a0fc0b98bb7c1e7c4ec Mon Sep 17 00:00:00 2001 From: Shengquan Ni <13672781+shengquan-ni@users.noreply.github.com> Date: Mon, 30 Dec 2024 00:54:55 -0800 Subject: [PATCH 18/47] Update .gitignore to include Metals generated folders to support VSCode-based IDEs (#3180) Add Scala ([Metals](https://github.com/scalameta/metals)) generated folders to `.gitignore` to support VSCode and Cursor IDEs. Metals is a popular plugin for VSCode-based IDEs for developing Scala applications. --- .gitignore | 8 +++++++- 1 file changed, 7 insertions(+), 1 deletion(-) diff --git a/.gitignore b/.gitignore index cda7b56ffe3..4fbd384afe6 100644 --- a/.gitignore +++ b/.gitignore @@ -102,4 +102,10 @@ StoredCredential* **/apache2/ **/Apache24/ **/php/ -Composer-Setup.exe \ No newline at end of file +Composer-Setup.exe + +# Ignore folders generated by Metals (VSCode-based IDEs) +.metals/ +.bloop/ +.ammonite/ +metals.sbt \ No newline at end of file From 35f184982e308f66ec62030acc4196ab5c972288 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Mon, 30 Dec 2024 12:37:16 -0800 Subject: [PATCH 19/47] Move protobuf definitions under core package (#3181) This PR moves all protobuf definitions under workflow-core into the `core` package namespace.
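The pattern the diffs in this commit apply everywhere can be sketched in one illustrative `.proto` fragment (not part of the patch; the package and message names below other than `ActorVirtualIdentity` are made up for illustration): import paths gain a `core/` segment, and type references gain a `core.` prefix, which proto3 resolves by walking outward from the current package to `edu.uci.ics.amber.core`.

```protobuf
syntax = "proto3";

// Hypothetical dependent file, to show the resolution rule only.
package edu.uci.ics.amber.engine.example;

// Before the move this was "edu/uci/ics/amber/virtualidentity.proto".
import "edu/uci/ics/amber/core/virtualidentity.proto";

message ExampleContext {
  // "core.ActorVirtualIdentity" resolves against enclosing packages,
  // innermost first, landing on edu.uci.ics.amber.core.ActorVirtualIdentity;
  // the ScalaPB-generated Scala import path shifts accordingly.
  core.ActorVirtualIdentity sender = 1;
}
```

This is why the Scala-side changes later in the patch are pure import rewrites (`edu.uci.ics.amber.virtualidentity` → `edu.uci.ics.amber.core.virtualidentity`) with no behavioral change.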
--- .../architecture/rpc/controlcommands.proto | 54 +++++++++---------- .../sendsemantics/partitionings.proto | 13 +++-- .../architecture/worker/statistics.proto | 4 +- .../amber/engine/common/ambermessage.proto | 6 +-- .../engine/common/executionruntimestate.proto | 20 +++---- .../amber/clustering/ClusterListener.scala | 6 +-- .../common/AkkaActorRefMappingService.scala | 2 +- .../common/AkkaActorService.scala | 2 +- .../common/AkkaMessageTransferService.scala | 2 +- .../architecture/common/AmberProcessor.scala | 2 +- .../common/ProcessingStepCursor.scala | 2 +- .../architecture/common/WorkflowActor.scala | 2 +- .../architecture/controller/ClientEvent.scala | 2 +- .../architecture/controller/Controller.scala | 2 +- ...ControllerAsyncRPCHandlerInitializer.scala | 2 +- .../controller/ControllerProcessor.scala | 2 +- .../controller/GlobalReplayManager.scala | 2 +- .../controller/WorkflowScheduler.scala | 2 +- .../controller/execution/LinkExecution.scala | 2 +- .../execution/OperatorExecution.scala | 4 +- .../execution/RegionExecution.scala | 4 +- .../execution/WorkflowExecution.scala | 2 +- .../ChannelMarkerHandler.scala | 2 +- .../promisehandlers/DebugCommandHandler.scala | 2 +- .../EvaluatePythonExpressionHandler.scala | 2 +- .../promisehandlers/PauseHandler.scala | 2 +- .../RetrieveWorkflowStateHandler.scala | 2 +- .../TakeGlobalCheckpointHandler.scala | 2 +- .../layer/WorkerExecution.scala | 2 +- .../logreplay/EmptyReplayLogger.scala | 2 +- .../logreplay/OrderEnforcer.scala | 2 +- .../logreplay/ReplayLogGenerator.scala | 2 +- .../logreplay/ReplayLogManager.scala | 2 +- .../architecture/logreplay/ReplayLogger.scala | 2 +- .../logreplay/ReplayLoggerImpl.scala | 2 +- .../logreplay/ReplayOrderEnforcer.scala | 2 +- .../messaginglayer/AmberFIFOChannel.scala | 4 +- .../messaginglayer/InputGateway.scala | 2 +- .../messaginglayer/InputManager.scala | 4 +- .../messaginglayer/NetworkInputGateway.scala | 2 +- .../messaginglayer/NetworkOutputGateway.scala | 2 +- 
.../messaginglayer/OutputManager.scala | 4 +- .../messaginglayer/WorkerPort.scala | 2 +- .../pythonworker/PythonProxyClient.scala | 2 +- .../pythonworker/PythonProxyServer.scala | 2 +- .../pythonworker/PythonWorkflowWorker.scala | 2 +- .../WorkerBatchInternalQueue.scala | 2 +- .../CostBasedScheduleGenerator.scala | 4 +- .../ExpansionGreedyScheduleGenerator.scala | 4 +- .../architecture/scheduling/Region.scala | 4 +- .../RegionExecutionCoordinator.scala | 2 +- .../architecture/scheduling/RegionPlan.scala | 2 +- .../scheduling/ScheduleGenerator.scala | 4 +- .../WorkflowExecutionCoordinator.scala | 2 +- .../scheduling/config/ChannelConfig.scala | 4 +- .../scheduling/config/LinkConfig.scala | 2 +- .../scheduling/config/ResourceConfig.scala | 4 +- .../scheduling/config/WorkerConfig.scala | 2 +- .../resourcePolicies/ResourceAllocator.scala | 4 +- .../partitioners/BroadcastPartitioner.scala | 2 +- .../HashBasedShufflePartitioner.scala | 2 +- .../partitioners/OneToOnePartitioner.scala | 2 +- .../partitioners/Partitioner.scala | 2 +- .../RangeBasedShufflePartitioner.scala | 2 +- .../partitioners/RoundRobinPartitioner.scala | 2 +- .../worker/ChannelMarkerManager.scala | 2 +- .../engine/architecture/worker/DPThread.scala | 2 +- .../architecture/worker/DataProcessor.scala | 4 +- .../DataProcessorRPCHandlerInitializer.scala | 2 +- .../architecture/worker/PauseManager.scala | 2 +- .../architecture/worker/PauseType.scala | 2 +- .../architecture/worker/WorkflowWorker.scala | 2 +- .../managers/SerializationManager.scala | 4 +- .../worker/managers/StatisticsManager.scala | 2 +- .../PrepareCheckpointHandler.scala | 2 +- .../worker/promisehandlers/StartHandler.scala | 4 +- .../amber/engine/common/AmberLogging.scala | 2 +- .../engine/common/CheckpointSupport.scala | 2 +- .../common/ambermessage/RecoveryPayload.scala | 2 +- .../common/ambermessage/WorkflowMessage.scala | 2 +- .../engine/common/client/ClientActor.scala | 2 +- .../engine/common/rpc/AsyncRPCClient.scala | 2 +- 
.../rpc/AsyncRPCHandlerInitializer.scala | 2 +- .../engine/common/rpc/AsyncRPCServer.scala | 2 +- .../common/statetransition/StateManager.scala | 2 +- .../statetransition/WorkerStateManager.scala | 2 +- .../engine/common/virtualidentity/util.scala | 2 +- .../edu/uci/ics/amber/error/ErrorUtils.scala | 2 +- .../ics/texera/web/ComputingUnitMaster.scala | 2 +- .../websocket/event/WorkflowErrorEvent.scala | 2 +- .../resource/WorkflowWebsocketResource.scala | 6 +-- .../workflow/WorkflowExecutionsResource.scala | 2 +- .../web/service/ExecutionConsoleService.scala | 2 +- .../web/service/ExecutionResultService.scala | 6 +-- .../web/service/ExecutionRuntimeService.scala | 2 +- .../web/service/ExecutionStatsService.scala | 4 +- .../ExecutionsMetadataPersistService.scala | 2 +- .../FriesReconfigurationAlgorithm.scala | 2 +- .../web/service/ResultExportService.scala | 4 +- .../texera/web/service/WorkflowService.scala | 6 +-- .../ExecutionReconfigurationStore.scala | 2 +- .../uci/ics/texera/workflow/LogicalLink.scala | 4 +- .../uci/ics/texera/workflow/LogicalPlan.scala | 2 +- .../texera/workflow/WorkflowCompiler.scala | 6 +-- .../control/TrivialControlSpec.scala | 2 +- .../control/utils/MultiCallHandler.scala | 2 +- .../TesterAsyncRPCHandlerInitializer.scala | 2 +- .../control/utils/TrivialControlTester.scala | 2 +- .../NetworkInputGatewaySpec.scala | 2 +- .../messaginglayer/OutputManagerSpec.scala | 4 +- .../RangeBasedShuffleSpec.scala | 2 +- .../PythonWorkflowWorkerSpec.scala | 2 +- .../CostBasedScheduleGeneratorSpec.scala | 2 +- ...ExpansionGreedyScheduleGeneratorSpec.scala | 4 +- .../architecture/worker/DPThreadSpec.scala | 4 +- .../worker/DataProcessorSpec.scala | 4 +- .../architecture/worker/WorkerSpec.scala | 4 +- .../engine/e2e/BatchSizePropagationSpec.scala | 2 +- .../amber/engine/e2e/DataProcessingSpec.scala | 4 +- .../uci/ics/amber/engine/e2e/PauseSpec.scala | 2 +- .../faulttolerance/CheckpointSpec.scala | 2 +- .../engine/faulttolerance/LoggingSpec.scala | 4 +- 
.../engine/faulttolerance/ReplaySpec.scala | 2 +- .../ics/amber/compiler/WorkflowCompiler.scala | 8 +-- .../amber/compiler/model/LogicalLink.scala | 4 +- .../amber/compiler/model/LogicalPlan.scala | 4 +- .../WorkflowCompilationResource.scala | 4 +- .../WorkflowCompilationResourceSpec.scala | 2 +- .../amber/{ => core}/virtualidentity.proto | 2 +- .../uci/ics/amber/{ => core}/workflow.proto | 4 +- .../{ => core}/workflowruntimestate.proto | 2 +- .../amber/core/WorkflowRuntimeException.scala | 2 +- .../core/executor/OperatorExecutor.scala | 2 +- .../core/executor/SinkOperatorExecutor.scala | 2 +- .../executor/SourceOperatorExecutor.scala | 2 +- .../core/storage/result/OpResultStorage.scala | 4 +- .../core/storage/result/ResultStorage.scala | 2 +- .../storage/result/WorkflowResultStore.scala | 2 +- .../util/mongo/MongoCollectionManager.scala | 2 +- .../uci/ics/amber/core/tuple/TupleLike.scala | 2 +- .../ics/amber/core/workflow/PhysicalOp.scala | 4 +- .../amber/core/workflow/PhysicalPlan.scala | 4 +- .../amber/core/workflow/WorkflowContext.scala | 2 +- .../edu/uci/ics/amber/util/JSONUtils.scala | 2 +- .../ics/amber/util/VirtualIdentityUtils.scala | 2 +- .../serde/PortIdentityKeyDeserializer.scala | 2 +- .../serde/PortIdentityKeySerializer.scala | 2 +- .../uci/ics/amber/operator/LogicalOp.scala | 8 ++- .../operator/PythonOperatorDescriptor.scala | 2 +- .../operator/SpecialPhysicalOpFactory.scala | 13 +++-- .../operator/aggregate/AggregateOpDesc.scala | 8 ++- .../CartesianProductOpDesc.scala | 4 +- .../dictionary/DictionaryMatcherOpDesc.scala | 4 +- .../difference/DifferenceOpDesc.scala | 4 +- .../operator/distinct/DistinctOpDesc.scala | 4 +- .../amber/operator/dummy/DummyOpDesc.scala | 2 +- .../amber/operator/filter/FilterOpDesc.scala | 2 +- .../filter/SpecializedFilterOpDesc.scala | 4 +- .../operator/hashJoin/HashJoinOpDesc.scala | 8 ++- ...gingFaceIrisLogisticRegressionOpDesc.scala | 2 +- .../HuggingFaceSentimentAnalysisOpDesc.scala | 2 +- 
.../HuggingFaceSpamSMSDetectionOpDesc.scala | 2 +- .../HuggingFaceTextSummarizationOpDesc.scala | 2 +- .../operator/intersect/IntersectOpDesc.scala | 4 +- .../intervalJoin/IntervalJoinOpDesc.scala | 4 +- .../keywordSearch/KeywordSearchOpDesc.scala | 4 +- .../amber/operator/limit/LimitOpDesc.scala | 4 +- .../Scorer/MachineLearningScorerOpDesc.scala | 2 +- .../base/SklearnAdvancedBaseDesc.scala | 2 +- .../ics/amber/operator/map/MapOpDesc.scala | 2 +- .../metadata/OperatorMetadataGenerator.scala | 2 +- .../projection/ProjectionOpDesc.scala | 4 +- .../RandomKSamplingOpDesc.scala | 4 +- .../amber/operator/regex/RegexOpDesc.scala | 4 +- .../ReservoirSamplingOpDesc.scala | 4 +- .../sentiment/SentimentAnalysisOpDesc.scala | 4 +- .../sink/managed/ProgressiveSinkOpExec.scala | 6 +-- .../sklearn/SklearnClassifierOpDesc.scala | 2 +- .../SklearnLinearRegressionOpDesc.scala | 2 +- .../sklearn/SklearnPredictionOpDesc.scala | 2 +- .../ics/amber/operator/sort/SortOpDesc.scala | 2 +- .../sortPartitions/SortPartitionsOpDesc.scala | 4 +- .../reddit/RedditSearchSourceOpDesc.scala | 2 +- .../apis/twitter/TwitterSourceOpDesc.scala | 2 +- ...TwitterFullArchiveSearchSourceOpDesc.scala | 2 +- .../v2/TwitterSearchSourceOpDesc.scala | 2 +- .../source/cache/CacheSourceOpExec.scala | 2 +- .../source/fetcher/URLFetcherOpDesc.scala | 4 +- .../source/scan/FileScanSourceOpDesc.scala | 2 +- .../source/scan/ScanSourceOpDesc.scala | 2 +- .../source/scan/arrow/ArrowSourceOpDesc.scala | 2 +- .../source/scan/csv/CSVScanSourceOpDesc.scala | 2 +- .../csv/ParallelCSVScanSourceOpDesc.scala | 2 +- .../scan/csvOld/CSVOldScanSourceOpDesc.scala | 2 +- .../scan/json/JSONLScanSourceOpDesc.scala | 2 +- .../scan/text/TextInputSourceOpDesc.scala | 4 +- .../sql/asterixdb/AsterixDBSourceOpDesc.scala | 4 +- .../source/sql/mysql/MySQLSourceOpDesc.scala | 4 +- .../postgresql/PostgreSQLSourceOpDesc.scala | 4 +- .../amber/operator/split/SplitOpDesc.scala | 4 +- .../amber/operator/split/SplitOpExec.scala | 2 +- 
.../SymmetricDifferenceOpDesc.scala | 4 +- .../typecasting/TypeCastingOpDesc.scala | 4 +- .../operator/udf/java/JavaUDFOpDesc.scala | 4 +- .../DualInputPortsPythonUDFOpDescV2.scala | 4 +- .../python/PythonLambdaFunctionOpDesc.scala | 2 +- .../udf/python/PythonTableReducerOpDesc.scala | 2 +- .../udf/python/PythonUDFOpDescV2.scala | 4 +- .../source/PythonUDFSourceOpDescV2.scala | 4 +- .../ics/amber/operator/udf/r/RUDFOpDesc.scala | 4 +- .../operator/udf/r/RUDFSourceOpDesc.scala | 4 +- .../amber/operator/union/UnionOpDesc.scala | 4 +- .../unneststring/UnnestStringOpDesc.scala | 4 +- .../visualization/DotPlot/DotPlotOpDesc.scala | 4 +- .../IcicleChart/IcicleChartOpDesc.scala | 4 +- .../ImageViz/ImageVisualizerOpDesc.scala | 4 +- .../ScatterMatrixChartOpDesc.scala | 4 +- .../barChart/BarChartOpDesc.scala | 4 +- .../visualization/boxPlot/BoxPlotOpDesc.scala | 4 +- .../bubbleChart/BubbleChartOpDesc.scala | 4 +- .../CandlestickChartOpDesc.scala | 4 +- .../ContinuousErrorBandsOpDesc.scala | 4 +- .../contourPlot/ContourPlotOpDesc.scala | 4 +- .../dumbbellPlot/DumbbellPlotOpDesc.scala | 4 +- .../FigureFactoryTableOpDesc.scala | 4 +- .../filledAreaPlot/FilledAreaPlotOpDesc.scala | 4 +- .../funnelPlot/FunnelPlotOpDesc.scala | 4 +- .../ganttChart/GanttChartOpDesc.scala | 4 +- .../visualization/heatMap/HeatMapOpDesc.scala | 4 +- .../hierarchychart/HierarchyChartOpDesc.scala | 4 +- .../histogram/HistogramChartOpDesc.scala | 4 +- .../visualization/htmlviz/HtmlVizOpDesc.scala | 6 +-- .../lineChart/LineChartOpDesc.scala | 4 +- .../pieChart/PieChartOpDesc.scala | 4 +- .../quiverPlot/QuiverPlotOpDesc.scala | 4 +- .../sankeyDiagram/SankeyDiagramOpDesc.scala | 4 +- .../scatter3DChart/Scatter3dChartOpDesc.scala | 4 +- .../scatterplot/ScatterplotOpDesc.scala | 4 +- .../tablesChart/TablesPlotOpDesc.scala | 4 +- .../ternaryPlot/TernaryPlotOpDesc.scala | 4 +- .../visualization/urlviz/UrlVizOpDesc.scala | 6 +-- .../waterfallChart/WaterfallChartOpDesc.scala | 4 +- 
.../wordCloud/WordCloudOpDesc.scala | 4 +- .../intersect/IntersectOpExecSpec.scala | 4 +- .../intervalJoin/IntervalOpExecSpec.scala | 4 +- .../scan/csv/CSVScanSourceOpDescSpec.scala | 2 +- .../unneststring/UnnestStringOpExecSpec.scala | 2 +- 247 files changed, 424 insertions(+), 410 deletions(-) rename core/workflow-core/src/main/protobuf/edu/uci/ics/amber/{ => core}/virtualidentity.proto (95%) rename core/workflow-core/src/main/protobuf/edu/uci/ics/amber/{ => core}/workflow.proto (93%) rename core/workflow-core/src/main/protobuf/edu/uci/ics/amber/{ => core}/workflowruntimestate.proto (94%) diff --git a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto index 202c355b854..81f7f4b21ba 100644 --- a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto +++ b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto @@ -1,8 +1,8 @@ syntax = "proto3"; package edu.uci.ics.amber.engine.architecture.rpc; -import "edu/uci/ics/amber/virtualidentity.proto"; -import "edu/uci/ics/amber/workflow.proto"; +import "edu/uci/ics/amber/core/virtualidentity.proto"; +import "edu/uci/ics/amber/core/workflow.proto"; import "edu/uci/ics/amber/engine/architecture/worker/statistics.proto"; import "edu/uci/ics/amber/engine/architecture/sendsemantics/partitionings.proto"; import "scalapb/scalapb.proto"; @@ -58,8 +58,8 @@ message EmptyRequest{} message AsyncRPCContext { option (scalapb.message).no_box = true; - ActorVirtualIdentity sender = 1 [(scalapb.field).no_box = true]; - ActorVirtualIdentity receiver = 2 [(scalapb.field).no_box = true]; + core.ActorVirtualIdentity sender = 1 [(scalapb.field).no_box = true]; + core.ActorVirtualIdentity receiver = 2 [(scalapb.field).no_box = true]; } message ControlInvocation { @@ -79,25 +79,25 @@ enum ChannelMarkerType { // Message for 
ChannelMarkerPayload message ChannelMarkerPayload { option (scalapb.message).extends = "edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessagePayload"; - ChannelMarkerIdentity id = 1 [(scalapb.field).no_box = true]; + core.ChannelMarkerIdentity id = 1 [(scalapb.field).no_box = true]; ChannelMarkerType markerType = 2; - repeated ChannelIdentity scope = 3; + repeated core.ChannelIdentity scope = 3; map commandMapping = 4; } message PropagateChannelMarkerRequest { - repeated PhysicalOpIdentity sourceOpToStartProp = 1; - ChannelMarkerIdentity id = 2 [(scalapb.field).no_box = true]; + repeated core.PhysicalOpIdentity sourceOpToStartProp = 1; + core.ChannelMarkerIdentity id = 2 [(scalapb.field).no_box = true]; ChannelMarkerType markerType = 3; - repeated PhysicalOpIdentity scope = 4; - repeated PhysicalOpIdentity targetOps = 5; + repeated core.PhysicalOpIdentity scope = 4; + repeated core.PhysicalOpIdentity targetOps = 5; ControlRequest markerCommand = 6; string markerMethodName = 7; } message TakeGlobalCheckpointRequest { bool estimationOnly = 1; - ChannelMarkerIdentity checkpointId = 2 [(scalapb.field).no_box = true]; + core.ChannelMarkerIdentity checkpointId = 2 [(scalapb.field).no_box = true]; string destination = 3; } @@ -122,7 +122,7 @@ message ModifyLogicRequest { } message RetryWorkflowRequest { - repeated ActorVirtualIdentity workers = 1; + repeated core.ActorVirtualIdentity workers = 1; } enum ConsoleMessageType{ @@ -147,7 +147,7 @@ message ConsoleMessageTriggeredRequest { } message PortCompletedRequest { - PortIdentity portId = 1 [(scalapb.field).no_box = true]; + core.PortIdentity portId = 1 [(scalapb.field).no_box = true]; bool input = 2; } @@ -156,21 +156,21 @@ message WorkerStateUpdatedRequest { } message LinkWorkersRequest { - PhysicalLink link = 1 [(scalapb.field).no_box = true]; + core.PhysicalLink link = 1 [(scalapb.field).no_box = true]; } // Ping message message Ping { int32 i = 1; int32 end = 2; - ActorVirtualIdentity to = 3 
[(scalapb.field).no_box = true]; + core.ActorVirtualIdentity to = 3 [(scalapb.field).no_box = true]; } // Pong message message Pong { int32 i = 1; int32 end = 2; - ActorVirtualIdentity to = 3 [(scalapb.field).no_box = true]; + core.ActorVirtualIdentity to = 3 [(scalapb.field).no_box = true]; } // Pass message @@ -185,7 +185,7 @@ message Nested { // MultiCall message message MultiCall { - repeated ActorVirtualIdentity seq = 1; + repeated core.ActorVirtualIdentity seq = 1; } // ErrorCommand message @@ -194,7 +194,7 @@ message ErrorCommand { // Collect message message Collect { - repeated ActorVirtualIdentity workers = 1; + repeated core.ActorVirtualIdentity workers = 1; } // GenerateNumber message @@ -203,7 +203,7 @@ message GenerateNumber { // Chain message message Chain { - repeated ActorVirtualIdentity nexts = 1; + repeated core.ActorVirtualIdentity nexts = 1; } // Recursion message @@ -213,23 +213,23 @@ message Recursion { // Messages for the commands message AddInputChannelRequest { - ChannelIdentity channelId = 1 [(scalapb.field).no_box = true]; - PortIdentity portId = 2 [(scalapb.field).no_box = true]; + core.ChannelIdentity channelId = 1 [(scalapb.field).no_box = true]; + core.PortIdentity portId = 2 [(scalapb.field).no_box = true]; } message AddPartitioningRequest { - PhysicalLink tag = 1 [(scalapb.field).no_box = true]; + core.PhysicalLink tag = 1 [(scalapb.field).no_box = true]; sendsemantics.Partitioning partitioning = 2 [(scalapb.field).no_box = true]; } message AssignPortRequest { - PortIdentity portId = 1 [(scalapb.field).no_box = true]; + core.PortIdentity portId = 1 [(scalapb.field).no_box = true]; bool input = 2; map schema = 3; } message FinalizeCheckpointRequest { - ChannelMarkerIdentity checkpointId = 1 [(scalapb.field).no_box = true]; + core.ChannelMarkerIdentity checkpointId = 1 [(scalapb.field).no_box = true]; string writeTo = 2; } @@ -241,16 +241,16 @@ message InitializeExecutorRequest { } message UpdateExecutorRequest { - PhysicalOpIdentity 
targetOpId = 1 [(scalapb.field).no_box = true]; + core.PhysicalOpIdentity targetOpId = 1 [(scalapb.field).no_box = true]; google.protobuf.Any newExecutor = 2 [(scalapb.field).no_box = true]; google.protobuf.Any stateTransferFunc = 3; } message PrepareCheckpointRequest{ - ChannelMarkerIdentity checkpointId = 1 [(scalapb.field).no_box = true]; + core.ChannelMarkerIdentity checkpointId = 1 [(scalapb.field).no_box = true]; bool estimationOnly = 2; } message QueryStatisticsRequest{ - repeated ActorVirtualIdentity filterByWorkers = 1; + repeated core.ActorVirtualIdentity filterByWorkers = 1; } \ No newline at end of file diff --git a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/sendsemantics/partitionings.proto b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/sendsemantics/partitionings.proto index 78c1cf8ab60..dfab32705b2 100644 --- a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/sendsemantics/partitionings.proto +++ b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/sendsemantics/partitionings.proto @@ -2,6 +2,7 @@ syntax = "proto3"; package edu.uci.ics.amber.engine.architecture.sendsemantics; +import "edu/uci/ics/amber/core/virtualidentity.proto"; import "scalapb/scalapb.proto"; option (scalapb.options) = { @@ -10,8 +11,6 @@ option (scalapb.options) = { no_default_values_in_constructor: true }; -import "edu/uci/ics/amber/virtualidentity.proto"; - message Partitioning{ oneof sealed_value{ OneToOnePartitioning oneToOnePartitioning = 1; @@ -24,23 +23,23 @@ message Partitioning{ message OneToOnePartitioning{ int32 batchSize = 1; - repeated ChannelIdentity channels = 2; + repeated core.ChannelIdentity channels = 2; } message RoundRobinPartitioning{ int32 batchSize = 1; - repeated ChannelIdentity channels = 2; + repeated core.ChannelIdentity channels = 2; } message HashBasedShufflePartitioning{ int32 batchSize = 1; - repeated ChannelIdentity channels = 2; + repeated core.ChannelIdentity 
channels = 2; repeated string hashAttributeNames = 3; } message RangeBasedShufflePartitioning { int32 batchSize = 1; - repeated ChannelIdentity channels = 2; + repeated core.ChannelIdentity channels = 2; repeated string rangeAttributeNames = 3; int64 rangeMin = 4; int64 rangeMax = 5; @@ -48,5 +47,5 @@ message RangeBasedShufflePartitioning { message BroadcastPartitioning{ int32 batchSize = 1; - repeated ChannelIdentity channels = 2; + repeated core.ChannelIdentity channels = 2; } diff --git a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/worker/statistics.proto b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/worker/statistics.proto index a6b64802949..02e8dd39a32 100644 --- a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/worker/statistics.proto +++ b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/worker/statistics.proto @@ -2,7 +2,7 @@ syntax = "proto3"; package edu.uci.ics.amber.engine.architecture.worker; -import "edu/uci/ics/amber/workflow.proto"; +import "edu/uci/ics/amber/core/workflow.proto"; import "scalapb/scalapb.proto"; option (scalapb.options) = { @@ -22,7 +22,7 @@ enum WorkerState { } message PortTupleCountMapping { - PortIdentity port_id = 1 [(scalapb.field).no_box = true]; + core.PortIdentity port_id = 1 [(scalapb.field).no_box = true]; int64 tuple_count = 2; } diff --git a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/common/ambermessage.proto b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/common/ambermessage.proto index e0fbdee43d3..ed22f55fd68 100644 --- a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/common/ambermessage.proto +++ b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/common/ambermessage.proto @@ -4,7 +4,7 @@ package edu.uci.ics.amber.engine.common; import "edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto"; import "edu/uci/ics/amber/engine/architecture/rpc/controlreturns.proto"; -import 
"edu/uci/ics/amber/virtualidentity.proto"; +import "edu/uci/ics/amber/core/virtualidentity.proto"; import "scalapb/scalapb.proto"; option (scalapb.options) = { @@ -21,11 +21,11 @@ message ControlPayloadV2 { } message PythonDataHeader { - ActorVirtualIdentity tag = 1 [(scalapb.field).no_box = true]; + core.ActorVirtualIdentity tag = 1 [(scalapb.field).no_box = true]; string payload_type = 2; } message PythonControlMessage { - ActorVirtualIdentity tag = 1 [(scalapb.field).no_box = true]; + core.ActorVirtualIdentity tag = 1 [(scalapb.field).no_box = true]; ControlPayloadV2 payload = 2 [(scalapb.field).no_box = true]; } diff --git a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/common/executionruntimestate.proto b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/common/executionruntimestate.proto index 7600422b4bd..f6a092477b3 100644 --- a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/common/executionruntimestate.proto +++ b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/common/executionruntimestate.proto @@ -5,8 +5,8 @@ package edu.uci.ics.amber.engine.common; import "edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto"; import "edu/uci/ics/amber/engine/architecture/rpc/controlreturns.proto"; import "edu/uci/ics/amber/engine/architecture/worker/statistics.proto"; -import "edu/uci/ics/amber/virtualidentity.proto"; -import "edu/uci/ics/amber/workflowruntimestate.proto"; +import "edu/uci/ics/amber/core/virtualidentity.proto"; +import "edu/uci/ics/amber/core/workflowruntimestate.proto"; import "scalapb/scalapb.proto"; option (scalapb.options) = { @@ -36,11 +36,11 @@ message ExecutionBreakpointStore{ } message EvaluatedValueList{ - repeated amber.engine.architecture.rpc.EvaluatedValue values = 1; + repeated architecture.rpc.EvaluatedValue values = 1; } message OperatorConsole{ - repeated edu.uci.ics.amber.engine.architecture.rpc.ConsoleMessage console_messages = 1; + repeated architecture.rpc.ConsoleMessage console_messages = 1; map 
evaluate_expr_results = 2; } @@ -54,8 +54,8 @@ message OperatorWorkerMapping{ } message OperatorStatistics{ - repeated amber.engine.architecture.worker.PortTupleCountMapping input_count = 1; - repeated amber.engine.architecture.worker.PortTupleCountMapping output_count = 2; + repeated architecture.worker.PortTupleCountMapping input_count = 1; + repeated architecture.worker.PortTupleCountMapping output_count = 2; int32 num_workers = 3; int64 data_processing_time = 4; int64 control_processing_time = 5; @@ -63,7 +63,7 @@ message OperatorStatistics{ } message OperatorMetrics{ - edu.uci.ics.amber.engine.architecture.rpc.WorkflowAggregatedState operator_state = 1 [(scalapb.field).no_box = true]; + architecture.rpc.WorkflowAggregatedState operator_state = 1 [(scalapb.field).no_box = true]; OperatorStatistics operator_statistics = 2 [(scalapb.field).no_box = true]; } @@ -76,8 +76,8 @@ message ExecutionStatsStore { message ExecutionMetadataStore{ - edu.uci.ics.amber.engine.architecture.rpc.WorkflowAggregatedState state = 1; - repeated WorkflowFatalError fatal_errors = 2; - ExecutionIdentity executionId = 3 [(scalapb.field).no_box = true]; + architecture.rpc.WorkflowAggregatedState state = 1; + repeated core.WorkflowFatalError fatal_errors = 2; + core.ExecutionIdentity executionId = 3 [(scalapb.field).no_box = true]; bool is_recovering = 4; } \ No newline at end of file diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/clustering/ClusterListener.scala b/core/amber/src/main/scala/edu/uci/ics/amber/clustering/ClusterListener.scala index 72cff807214..9c639946d3c 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/clustering/ClusterListener.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/clustering/ClusterListener.scala @@ -12,9 +12,9 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregat } import edu.uci.ics.amber.engine.common.{AmberConfig, AmberLogging} import edu.uci.ics.amber.error.ErrorUtils.getStackTraceWithAllCauses 
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity -import edu.uci.ics.amber.workflowruntimestate.FatalErrorType.EXECUTION_FAILURE -import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.workflowruntimestate.FatalErrorType.EXECUTION_FAILURE +import edu.uci.ics.amber.core.workflowruntimestate.WorkflowFatalError import edu.uci.ics.texera.web.SessionState import edu.uci.ics.texera.web.model.websocket.response.ClusterStatusUpdateEvent import edu.uci.ics.texera.web.service.{WorkflowExecutionService, WorkflowService} diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaActorRefMappingService.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaActorRefMappingService.scala index 3bf9ce4b7cb..7231695b5d0 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaActorRefMappingService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaActorRefMappingService.scala @@ -10,7 +10,7 @@ import edu.uci.ics.amber.engine.architecture.common.WorkflowActor.{ import edu.uci.ics.amber.engine.common.AmberLogging import edu.uci.ics.amber.engine.common.virtualidentity.util.{CONTROLLER, SELF} import edu.uci.ics.amber.util.VirtualIdentityUtils -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} import scala.collection.mutable diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaActorService.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaActorService.scala index 6b870e00867..3cc36a77ead 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaActorService.scala +++ 
b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaActorService.scala
@@ -3,7 +3,7 @@ package edu.uci.ics.amber.engine.architecture.common
 import akka.actor.{ActorContext, ActorRef, Address, Cancellable, Props}
 import akka.util.Timeout
 import edu.uci.ics.amber.engine.common.FutureBijection._
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 
 import scala.concurrent.ExecutionContext
 import scala.concurrent.duration.{DurationInt, FiniteDuration}
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaMessageTransferService.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaMessageTransferService.scala
index 603d2b2e8c0..4609f309783 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaMessageTransferService.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AkkaMessageTransferService.scala
@@ -5,7 +5,7 @@ import edu.uci.ics.amber.engine.architecture.common.WorkflowActor.NetworkMessage
 import edu.uci.ics.amber.engine.architecture.messaginglayer.{CongestionControl, FlowControl}
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage
 import edu.uci.ics.amber.engine.common.{AmberConfig, AmberLogging}
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
 
 import scala.collection.mutable
 import scala.concurrent.duration.DurationInt
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AmberProcessor.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AmberProcessor.scala
index 3a967e1839e..301f5b086ad 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AmberProcessor.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/AmberProcessor.scala
@@ -12,7 +12,7 @@ import edu.uci.ics.amber.engine.architecture.worker.managers.StatisticsManager
 import edu.uci.ics.amber.engine.common.AmberLogging
 import edu.uci.ics.amber.engine.common.ambermessage.{ControlPayload, WorkflowFIFOMessage}
 import edu.uci.ics.amber.engine.common.rpc.{AsyncRPCClient, AsyncRPCServer}
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
 
 abstract class AmberProcessor(
     val actorId: ActorVirtualIdentity,
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/ProcessingStepCursor.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/ProcessingStepCursor.scala
index c3a8d96f7fc..d9d32b752a7 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/ProcessingStepCursor.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/ProcessingStepCursor.scala
@@ -1,7 +1,7 @@
 package edu.uci.ics.amber.engine.architecture.common
 
 import edu.uci.ics.amber.engine.architecture.common.ProcessingStepCursor.INIT_STEP
-import edu.uci.ics.amber.virtualidentity.ChannelIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity
 
 object ProcessingStepCursor {
   // step value before processing any incoming message
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/WorkflowActor.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/WorkflowActor.scala
index bf988f10ee6..898ab42444e 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/WorkflowActor.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/common/WorkflowActor.scala
@@ -20,7 +20,7 @@ import edu.uci.ics.amber.engine.architecture.worker.WorkflowWorker.{
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage
 import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage
 import edu.uci.ics.amber.engine.common.{AmberLogging, CheckpointState}
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
 
 import scala.concurrent.Await
 import scala.concurrent.duration.DurationInt
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ClientEvent.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ClientEvent.scala
index ea4e2137a88..2439ecaba16 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ClientEvent.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ClientEvent.scala
@@ -4,7 +4,7 @@ import edu.uci.ics.amber.core.tuple.Tuple
 import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregatedState
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessagePayload
 import edu.uci.ics.amber.engine.common.executionruntimestate.OperatorMetrics
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 
 trait ClientEvent extends WorkflowFIFOMessagePayload
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/Controller.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/Controller.scala
index 80e58b86f6e..ca2fdb5477f 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/Controller.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/Controller.scala
@@ -18,7 +18,7 @@ import edu.uci.ics.amber.engine.common.ambermessage.WorkflowMessage.getInMemSize
 import edu.uci.ics.amber.engine.common.ambermessage.{ControlPayload, WorkflowFIFOMessage}
 import edu.uci.ics.amber.engine.common.virtualidentity.util.{CLIENT, CONTROLLER, SELF}
 import edu.uci.ics.amber.engine.common.{AmberConfig, CheckpointState, SerializedState}
-import edu.uci.ics.amber.virtualidentity.ChannelIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity
 
 import scala.concurrent.duration.DurationInt
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ControllerAsyncRPCHandlerInitializer.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ControllerAsyncRPCHandlerInitializer.scala
index d30439cbc5a..634e603dc5a 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ControllerAsyncRPCHandlerInitializer.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ControllerAsyncRPCHandlerInitializer.scala
@@ -6,7 +6,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.AsyncRPCContext
 import edu.uci.ics.amber.engine.architecture.rpc.controllerservice.ControllerServiceFs2Grpc
 import edu.uci.ics.amber.engine.common.AmberLogging
 import edu.uci.ics.amber.engine.common.rpc.AsyncRPCHandlerInitializer
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 
 class ControllerAsyncRPCHandlerInitializer(
     val cp: ControllerProcessor
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ControllerProcessor.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ControllerProcessor.scala
index 215d18eb9d2..4d2ae3461a9 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ControllerProcessor.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/ControllerProcessor.scala
@@ -12,7 +12,7 @@ import edu.uci.ics.amber.engine.architecture.logreplay.ReplayLogManager
 import edu.uci.ics.amber.engine.architecture.scheduling.WorkflowExecutionCoordinator
 import edu.uci.ics.amber.engine.architecture.worker.WorkflowWorker.MainThreadDelegateMessage
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 
 class ControllerProcessor(
     workflowContext: WorkflowContext,
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/GlobalReplayManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/GlobalReplayManager.scala
index 83aae75afab..971151cb179 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/GlobalReplayManager.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/GlobalReplayManager.scala
@@ -1,6 +1,6 @@
 package edu.uci.ics.amber.engine.architecture.controller
 
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/WorkflowScheduler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/WorkflowScheduler.scala
index bec06fd7a93..eefe464d27e 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/WorkflowScheduler.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/WorkflowScheduler.scala
@@ -8,7 +8,7 @@ import edu.uci.ics.amber.engine.architecture.scheduling.{
   Schedule
 }
 import edu.uci.ics.amber.engine.common.AmberConfig
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 
 class WorkflowScheduler(
     workflowContext: WorkflowContext,
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/LinkExecution.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/LinkExecution.scala
index b9e05df7a50..e5e3ce34668 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/LinkExecution.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/LinkExecution.scala
@@ -1,6 +1,6 @@
 package edu.uci.ics.amber.engine.architecture.controller.execution
 
-import edu.uci.ics.amber.virtualidentity.ChannelIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/OperatorExecution.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/OperatorExecution.scala
index 32df811be94..74d9d7441c6 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/OperatorExecution.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/OperatorExecution.scala
@@ -5,8 +5,8 @@ import edu.uci.ics.amber.engine.architecture.deploysemantics.layer.WorkerExecuti
 import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregatedState
 import edu.uci.ics.amber.engine.architecture.worker.statistics.{PortTupleCountMapping, WorkerState}
 import edu.uci.ics.amber.engine.common.executionruntimestate.{OperatorMetrics, OperatorStatistics}
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
-import edu.uci.ics.amber.workflow.PortIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.workflow.PortIdentity
 
 import java.util
 import scala.jdk.CollectionConverters._
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/RegionExecution.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/RegionExecution.scala
index 53c28796f87..22701515f9e 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/RegionExecution.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/RegionExecution.scala
@@ -5,8 +5,8 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregat
 import edu.uci.ics.amber.engine.architecture.scheduling.Region
 import edu.uci.ics.amber.engine.architecture.worker.statistics.WorkerStatistics
 import edu.uci.ics.amber.engine.common.executionruntimestate.OperatorMetrics
-import edu.uci.ics.amber.virtualidentity.PhysicalOpIdentity
-import edu.uci.ics.amber.workflow.PhysicalLink
+import edu.uci.ics.amber.core.virtualidentity.PhysicalOpIdentity
+import edu.uci.ics.amber.core.workflow.PhysicalLink
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/WorkflowExecution.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/WorkflowExecution.scala
index cb945cd4942..0784621f8b0 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/WorkflowExecution.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/execution/WorkflowExecution.scala
@@ -5,7 +5,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregat
 import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregatedState._
 import edu.uci.ics.amber.engine.architecture.scheduling.{Region, RegionIdentity}
 import edu.uci.ics.amber.engine.common.executionruntimestate.OperatorMetrics
-import edu.uci.ics.amber.virtualidentity.PhysicalOpIdentity
+import edu.uci.ics.amber.core.virtualidentity.PhysicalOpIdentity
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/ChannelMarkerHandler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/ChannelMarkerHandler.scala
index 4f0efe392ac..a01ee8a6aa7 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/ChannelMarkerHandler.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/ChannelMarkerHandler.scala
@@ -13,7 +13,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.{
 }
 import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER
 import edu.uci.ics.amber.util.VirtualIdentityUtils
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
 
 trait ChannelMarkerHandler {
   this: ControllerAsyncRPCHandlerInitializer =>
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/DebugCommandHandler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/DebugCommandHandler.scala
index adeffba80ec..0be7f8faef7 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/DebugCommandHandler.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/DebugCommandHandler.scala
@@ -7,7 +7,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.{
   DebugCommandRequest
 }
 import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.EmptyReturn
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 
 trait DebugCommandHandler {
   this: ControllerAsyncRPCHandlerInitializer =>
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/EvaluatePythonExpressionHandler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/EvaluatePythonExpressionHandler.scala
index 394b11c0d86..d7d398912a4 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/EvaluatePythonExpressionHandler.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/EvaluatePythonExpressionHandler.scala
@@ -7,7 +7,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.{
   EvaluatePythonExpressionRequest
 }
 import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.EvaluatePythonExpressionResponse
-import edu.uci.ics.amber.virtualidentity.OperatorIdentity
+import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity
 
 trait EvaluatePythonExpressionHandler {
   this: ControllerAsyncRPCHandlerInitializer =>
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/PauseHandler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/PauseHandler.scala
index b7d64982d8e..6dd48bcd21f 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/PauseHandler.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/PauseHandler.scala
@@ -9,7 +9,7 @@ import edu.uci.ics.amber.engine.architecture.controller.{
 }
 import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.{AsyncRPCContext, EmptyRequest}
 import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.{EmptyReturn, WorkerMetricsResponse}
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/RetrieveWorkflowStateHandler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/RetrieveWorkflowStateHandler.scala
index 0e3b5733497..766384125e9 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/RetrieveWorkflowStateHandler.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/RetrieveWorkflowStateHandler.scala
@@ -14,7 +14,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.{
 }
 import edu.uci.ics.amber.engine.architecture.rpc.workerservice.WorkerServiceGrpc.METHOD_RETRIEVE_STATE
 import edu.uci.ics.amber.engine.common.virtualidentity.util.SELF
-import edu.uci.ics.amber.virtualidentity.ChannelMarkerIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelMarkerIdentity
 
 import java.time.Instant
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/TakeGlobalCheckpointHandler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/TakeGlobalCheckpointHandler.scala
index 3c7d9c7dd3c..2478f0ec89b 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/TakeGlobalCheckpointHandler.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/controller/promisehandlers/TakeGlobalCheckpointHandler.scala
@@ -9,7 +9,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.workerservice.WorkerServiceGrpc
 import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage
 import edu.uci.ics.amber.engine.common.virtualidentity.util.SELF
 import edu.uci.ics.amber.engine.common.{CheckpointState, SerializedState}
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 
 import java.net.URI
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/deploysemantics/layer/WorkerExecution.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/deploysemantics/layer/WorkerExecution.scala
index 14ea621f56c..ab391e5adad 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/deploysemantics/layer/WorkerExecution.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/deploysemantics/layer/WorkerExecution.scala
@@ -3,7 +3,7 @@ package edu.uci.ics.amber.engine.architecture.deploysemantics.layer
 import edu.uci.ics.amber.engine.architecture.controller.execution.WorkerPortExecution
 import edu.uci.ics.amber.engine.architecture.worker.statistics.WorkerState.UNINITIALIZED
 import edu.uci.ics.amber.engine.architecture.worker.statistics.{WorkerState, WorkerStatistics}
-import edu.uci.ics.amber.workflow.PortIdentity
+import edu.uci.ics.amber.core.workflow.PortIdentity
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/EmptyReplayLogger.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/EmptyReplayLogger.scala
index 6864d56eb1c..47613a9c790 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/EmptyReplayLogger.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/EmptyReplayLogger.scala
@@ -1,7 +1,7 @@
 package edu.uci.ics.amber.engine.architecture.logreplay
 
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage
-import edu.uci.ics.amber.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity}
 
 class EmptyReplayLogger extends ReplayLogger {
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/OrderEnforcer.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/OrderEnforcer.scala
index dc395293df7..7ff3a01dadd 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/OrderEnforcer.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/OrderEnforcer.scala
@@ -1,6 +1,6 @@
 package edu.uci.ics.amber.engine.architecture.logreplay
 
-import edu.uci.ics.amber.virtualidentity.ChannelIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity
 
 trait OrderEnforcer {
   var isCompleted: Boolean
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogGenerator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogGenerator.scala
index 7df42f20e4a..176914604b8 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogGenerator.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogGenerator.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.logreplay
 
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage
 import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage
-import edu.uci.ics.amber.virtualidentity.ChannelMarkerIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelMarkerIdentity
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogManager.scala
index a304754390a..3b9ca5dfa00 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogManager.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogManager.scala
@@ -5,7 +5,7 @@ import edu.uci.ics.amber.engine.architecture.worker.WorkflowWorker.MainThreadDel
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage
 import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage.SequentialRecordWriter
 import edu.uci.ics.amber.engine.common.storage.{EmptyRecordStorage, SequentialRecordStorage}
-import edu.uci.ics.amber.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity}
 
 //In-mem formats:
 sealed trait ReplayLogRecord
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogger.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogger.scala
index 24858f56e45..f5faf1ae9c6 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogger.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLogger.scala
@@ -1,7 +1,7 @@
 package edu.uci.ics.amber.engine.architecture.logreplay
 
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage
-import edu.uci.ics.amber.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity}
 
 abstract class ReplayLogger {
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLoggerImpl.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLoggerImpl.scala
index 6e7b8962a27..960a456b0af 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLoggerImpl.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayLoggerImpl.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.logreplay
 
 import edu.uci.ics.amber.engine.architecture.common.ProcessingStepCursor.INIT_STEP
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage
-import edu.uci.ics.amber.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity}
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayOrderEnforcer.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayOrderEnforcer.scala
index f62254b80d9..b48b1e20d55 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayOrderEnforcer.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/logreplay/ReplayOrderEnforcer.scala
@@ -1,6 +1,6 @@
 package edu.uci.ics.amber.engine.architecture.logreplay
 
-import edu.uci.ics.amber.virtualidentity.ChannelIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/AmberFIFOChannel.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/AmberFIFOChannel.scala
index 2cfc64fad9b..46e0845d3cb 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/AmberFIFOChannel.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/AmberFIFOChannel.scala
@@ -3,8 +3,8 @@ package edu.uci.ics.amber.engine.architecture.messaginglayer
 import edu.uci.ics.amber.engine.common.AmberLogging
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowMessage.getInMemSize
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
-import edu.uci.ics.amber.workflow.PortIdentity
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
+import edu.uci.ics.amber.core.workflow.PortIdentity
 
 import java.util.concurrent.atomic.AtomicLong
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/InputGateway.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/InputGateway.scala
index c64e82864c8..6c345c861c7 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/InputGateway.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/InputGateway.scala
@@ -1,7 +1,7 @@
 package edu.uci.ics.amber.engine.architecture.messaginglayer
 
 import edu.uci.ics.amber.engine.architecture.logreplay.OrderEnforcer
-import edu.uci.ics.amber.virtualidentity.ChannelIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity
 
 trait InputGateway {
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/InputManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/InputManager.scala
index 8f74e129ff8..7d479d86aa7 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/InputManager.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/InputManager.scala
@@ -2,8 +2,8 @@ package edu.uci.ics.amber.engine.architecture.messaginglayer
 
 import edu.uci.ics.amber.core.tuple.{Schema, Tuple}
 import edu.uci.ics.amber.engine.common.AmberLogging
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
-import edu.uci.ics.amber.workflow.PortIdentity
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
+import edu.uci.ics.amber.core.workflow.PortIdentity
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGateway.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGateway.scala
index 25f4de433b1..b9f4c3ac5d8 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGateway.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGateway.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.messaginglayer
 
 import edu.uci.ics.amber.engine.architecture.logreplay.OrderEnforcer
 import edu.uci.ics.amber.engine.common.AmberLogging
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkOutputGateway.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkOutputGateway.scala
index de074876934..f44e3c6b29a 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkOutputGateway.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkOutputGateway.scala
@@ -8,7 +8,7 @@ import edu.uci.ics.amber.engine.common.ambermessage.{
   WorkflowFIFOMessagePayload
 }
 import edu.uci.ics.amber.engine.common.virtualidentity.util.SELF
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
 
 import java.util.concurrent.atomic.AtomicLong
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManager.scala
index d83eb0c1a57..e6ce0c84233 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManager.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManager.scala
@@ -16,8 +16,8 @@ import edu.uci.ics.amber.engine.architecture.messaginglayer.OutputManager.{
 import edu.uci.ics.amber.engine.architecture.sendsemantics.partitioners._
 import edu.uci.ics.amber.engine.architecture.sendsemantics.partitionings._
 import edu.uci.ics.amber.engine.common.AmberLogging
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
-import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity}
+import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity}
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/WorkerPort.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/WorkerPort.scala
index fb75c05900f..4664f7b6d1e 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/WorkerPort.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/WorkerPort.scala
@@ -1,7 +1,7 @@
 package edu.uci.ics.amber.engine.architecture.messaginglayer
 
 import edu.uci.ics.amber.core.tuple.Schema
-import edu.uci.ics.amber.virtualidentity.ChannelIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala
index 6ef8f53e415..05cb7b0758a 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala
@@ -21,7 +21,7 @@ import edu.uci.ics.amber.engine.common.actormessage.{ActorCommand, PythonActorMe
 import edu.uci.ics.amber.engine.common.ambermessage._
 import edu.uci.ics.amber.engine.common.{AmberLogging, AmberRuntime}
 import edu.uci.ics.amber.util.ArrowUtils
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 import org.apache.arrow.flight._
 import org.apache.arrow.memory.{ArrowBuf, BufferAllocator, RootAllocator}
 import org.apache.arrow.vector.VectorSchemaRoot
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyServer.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyServer.scala
index 523b9442800..c2b31717463 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyServer.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyServer.scala
@@ -12,7 +12,7 @@ import edu.uci.ics.amber.engine.common.ambermessage.ControlPayloadV2.Value.{
 }
 import edu.uci.ics.amber.engine.common.ambermessage._
 import edu.uci.ics.amber.util.ArrowUtils
-import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity
+import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity
 import org.apache.arrow.flight._
 import org.apache.arrow.memory.{ArrowBuf, BufferAllocator, RootAllocator}
 import org.apache.arrow.util.AutoCloseables
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonWorkflowWorker.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonWorkflowWorker.scala
index fcdfc0048b0..507f708afe7 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonWorkflowWorker.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonWorkflowWorker.scala
@@ -15,7 +15,7 @@ import edu.uci.ics.amber.engine.common.actormessage.{Backpressure, CreditUpdate}
 import edu.uci.ics.amber.engine.common.ambermessage.WorkflowMessage.getInMemSize
 import edu.uci.ics.amber.engine.common.ambermessage._
 import edu.uci.ics.amber.engine.common.{CheckpointState, Utils}
-import edu.uci.ics.amber.virtualidentity.ChannelIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity
 
 import java.nio.file.Path
 import java.util.concurrent.{ExecutorService, Executors}
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/WorkerBatchInternalQueue.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/WorkerBatchInternalQueue.scala
index 984cbf15262..485cf94c72b 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/WorkerBatchInternalQueue.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/WorkerBatchInternalQueue.scala
@@ -3,7 +3,7 @@ package edu.uci.ics.amber.engine.architecture.pythonworker
 import edu.uci.ics.amber.engine.architecture.pythonworker.WorkerBatchInternalQueue._
 import edu.uci.ics.amber.engine.common.actormessage.ActorCommand
 import edu.uci.ics.amber.engine.common.ambermessage.{ControlPayload, DataFrame, DataPayload}
-import edu.uci.ics.amber.virtualidentity.ChannelIdentity
+import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity
 import lbmq.LinkedBlockingMultiQueue
 
 import scala.collection.mutable
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGenerator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGenerator.scala
index ca2c5553633..457bd15e6ab 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGenerator.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGenerator.scala
@@ -2,8 +2,8 @@ package edu.uci.ics.amber.engine.architecture.scheduling
 
 import edu.uci.ics.amber.core.workflow.{PhysicalPlan, WorkflowContext}
 import edu.uci.ics.amber.engine.common.{AmberConfig, AmberLogging}
-import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, PhysicalOpIdentity}
-import edu.uci.ics.amber.workflow.PhysicalLink
+import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, PhysicalOpIdentity}
+import edu.uci.ics.amber.core.workflow.PhysicalLink
 import org.jgrapht.alg.connectivity.BiconnectivityInspector
 import org.jgrapht.graph.{DirectedAcyclicGraph, DirectedPseudograph}
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGenerator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGenerator.scala
index 12206488b55..2457d09b4cd 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGenerator.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGenerator.scala
@@ -3,8 +3,8 @@ package edu.uci.ics.amber.engine.architecture.scheduling
 import com.typesafe.scalalogging.LazyLogging
 import edu.uci.ics.amber.core.WorkflowRuntimeException
 import edu.uci.ics.amber.core.workflow.{PhysicalPlan, WorkflowContext}
-import edu.uci.ics.amber.virtualidentity.PhysicalOpIdentity
-import edu.uci.ics.amber.workflow.PhysicalLink
+import edu.uci.ics.amber.core.virtualidentity.PhysicalOpIdentity
+import edu.uci.ics.amber.core.workflow.PhysicalLink
 import org.jgrapht.alg.connectivity.BiconnectivityInspector
 import org.jgrapht.graph.DirectedAcyclicGraph
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/Region.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/Region.scala
index aae121e6fcc..f90bae23cc2 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/Region.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/Region.scala
@@ -2,8 +2,8 @@ package edu.uci.ics.amber.engine.architecture.scheduling
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.engine.architecture.scheduling.config.ResourceConfig
-import edu.uci.ics.amber.virtualidentity.PhysicalOpIdentity
-import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity}
+import edu.uci.ics.amber.core.virtualidentity.PhysicalOpIdentity
+import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity}
 import org.jgrapht.graph.{DefaultEdge, DirectedAcyclicGraph}
 import org.jgrapht.traverse.TopologicalOrderIterator
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionExecutionCoordinator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionExecutionCoordinator.scala
index 283c3b843bf..8567e0f17bb 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionExecutionCoordinator.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionExecutionCoordinator.scala
@@ -28,7 +28,7 @@ import edu.uci.ics.amber.engine.architecture.scheduling.config.{OperatorConfig,
 import edu.uci.ics.amber.engine.common.AmberRuntime
 import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient
 import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER
-import edu.uci.ics.amber.workflow.PhysicalLink
+import edu.uci.ics.amber.core.workflow.PhysicalLink
 
 class RegionExecutionCoordinator(
     region: Region,
diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionPlan.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionPlan.scala
index fd7c5717045..331bad87884 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionPlan.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionPlan.scala
@@ -1,6 +1,6 @@
 package edu.uci.ics.amber.engine.architecture.scheduling
 
-import
edu.uci.ics.amber.workflow.PhysicalLink +import edu.uci.ics.amber.core.workflow.PhysicalLink import org.jgrapht.graph.DirectedAcyclicGraph import org.jgrapht.traverse.TopologicalOrderIterator diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala index 5728f197dbc..cc4a84e1a24 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/ScheduleGenerator.scala @@ -8,8 +8,8 @@ import edu.uci.ics.amber.engine.architecture.scheduling.resourcePolicies.{ ExecutionClusterInfo } import edu.uci.ics.amber.operator.SpecialPhysicalOpFactory -import edu.uci.ics.amber.virtualidentity.PhysicalOpIdentity -import edu.uci.ics.amber.workflow.PhysicalLink +import edu.uci.ics.amber.core.virtualidentity.PhysicalOpIdentity +import edu.uci.ics.amber.core.workflow.PhysicalLink import org.jgrapht.graph.DirectedAcyclicGraph import org.jgrapht.traverse.TopologicalOrderIterator diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/WorkflowExecutionCoordinator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/WorkflowExecutionCoordinator.scala index 937b7c5f764..e1bcb2289e9 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/WorkflowExecutionCoordinator.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/WorkflowExecutionCoordinator.scala @@ -6,7 +6,7 @@ import edu.uci.ics.amber.engine.architecture.common.AkkaActorService import edu.uci.ics.amber.engine.architecture.controller.ControllerConfig import edu.uci.ics.amber.engine.architecture.controller.execution.WorkflowExecution import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient -import 
edu.uci.ics.amber.workflow.PhysicalLink +import edu.uci.ics.amber.core.workflow.PhysicalLink import scala.collection.mutable diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/ChannelConfig.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/ChannelConfig.scala index 73944014928..3e5ef3d2d01 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/ChannelConfig.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/ChannelConfig.scala @@ -9,8 +9,8 @@ import edu.uci.ics.amber.core.workflow.{ SinglePartition, UnknownPartition } -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.workflow.PortIdentity case object ChannelConfig { def generateChannelConfigs( diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/LinkConfig.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/LinkConfig.scala index d31f9045949..a34e7d7e5a1 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/LinkConfig.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/LinkConfig.scala @@ -10,7 +10,7 @@ import edu.uci.ics.amber.core.workflow.{ UnknownPartition } import edu.uci.ics.amber.engine.architecture.sendsemantics.partitionings._ -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} case object LinkConfig { def toPartitioning( diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/ResourceConfig.scala 
b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/ResourceConfig.scala index 311c35308ac..fd91ccac385 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/ResourceConfig.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/ResourceConfig.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.engine.architecture.scheduling.config -import edu.uci.ics.amber.virtualidentity.PhysicalOpIdentity -import edu.uci.ics.amber.workflow.PhysicalLink +import edu.uci.ics.amber.core.virtualidentity.PhysicalOpIdentity +import edu.uci.ics.amber.core.workflow.PhysicalLink case class ResourceConfig( operatorConfigs: Map[PhysicalOpIdentity, OperatorConfig], diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/WorkerConfig.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/WorkerConfig.scala index bc0a9457bf9..84868acc335 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/WorkerConfig.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/config/WorkerConfig.scala @@ -3,7 +3,7 @@ package edu.uci.ics.amber.engine.architecture.scheduling.config import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.engine.common.AmberConfig import edu.uci.ics.amber.util.VirtualIdentityUtils -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity case object WorkerConfig { def generateWorkerConfigs(physicalOp: PhysicalOp): List[WorkerConfig] = { diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/resourcePolicies/ResourceAllocator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/resourcePolicies/ResourceAllocator.scala index f44b04577ac..255d4177760 100644 --- 
a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/resourcePolicies/ResourceAllocator.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/resourcePolicies/ResourceAllocator.scala @@ -10,8 +10,8 @@ import edu.uci.ics.amber.engine.architecture.scheduling.config.{ OperatorConfig, ResourceConfig } -import edu.uci.ics.amber.virtualidentity.PhysicalOpIdentity -import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.PhysicalOpIdentity +import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} import scala.collection.mutable diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/BroadcastPartitioner.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/BroadcastPartitioner.scala index 724007d8e16..84227f829d8 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/BroadcastPartitioner.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/BroadcastPartitioner.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.sendsemantics.partitioners import edu.uci.ics.amber.core.tuple.Tuple import edu.uci.ics.amber.engine.architecture.sendsemantics.partitionings.BroadcastPartitioning -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity case class BroadcastPartitioner(partitioning: BroadcastPartitioning) extends Partitioner { diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/HashBasedShufflePartitioner.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/HashBasedShufflePartitioner.scala index 80f79a4e6e7..355c8a51853 100644 --- 
a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/HashBasedShufflePartitioner.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/HashBasedShufflePartitioner.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.sendsemantics.partitioners import edu.uci.ics.amber.core.tuple.Tuple import edu.uci.ics.amber.engine.architecture.sendsemantics.partitionings.HashBasedShufflePartitioning -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity case class HashBasedShufflePartitioner(partitioning: HashBasedShufflePartitioning) extends Partitioner { diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/OneToOnePartitioner.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/OneToOnePartitioner.scala index 4c1df816d66..e4e9f24e749 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/OneToOnePartitioner.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/OneToOnePartitioner.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.sendsemantics.partitioners import edu.uci.ics.amber.core.tuple.Tuple import edu.uci.ics.amber.engine.architecture.sendsemantics.partitionings.OneToOnePartitioning -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity case class OneToOnePartitioner(partitioning: OneToOnePartitioning, actorId: ActorVirtualIdentity) extends Partitioner { diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/Partitioner.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/Partitioner.scala index ba447481878..ab3c0149a92 
100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/Partitioner.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/Partitioner.scala @@ -5,7 +5,7 @@ import edu.uci.ics.amber.core.tuple.Tuple import edu.uci.ics.amber.engine.architecture.messaginglayer.NetworkOutputGateway import edu.uci.ics.amber.engine.common.AmberConfig import edu.uci.ics.amber.engine.common.ambermessage.{DataFrame, MarkerFrame} -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity import scala.collection.mutable.ArrayBuffer diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/RangeBasedShufflePartitioner.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/RangeBasedShufflePartitioner.scala index cb78202723c..6f549287c7b 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/RangeBasedShufflePartitioner.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/RangeBasedShufflePartitioner.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.sendsemantics.partitioners import edu.uci.ics.amber.core.tuple.{AttributeType, Tuple} import edu.uci.ics.amber.engine.architecture.sendsemantics.partitionings.RangeBasedShufflePartitioning -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity case class RangeBasedShufflePartitioner(partitioning: RangeBasedShufflePartitioning) extends Partitioner { diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/RoundRobinPartitioner.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/RoundRobinPartitioner.scala index 
1918ea6ea0b..c4125f6ce04 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/RoundRobinPartitioner.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/sendsemantics/partitioners/RoundRobinPartitioner.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.sendsemantics.partitioners import edu.uci.ics.amber.core.tuple.Tuple import edu.uci.ics.amber.engine.architecture.sendsemantics.partitionings.RoundRobinPartitioning -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity case class RoundRobinPartitioner(partitioning: RoundRobinPartitioning) extends Partitioner { private var roundRobinIndex = 0 diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/ChannelMarkerManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/ChannelMarkerManager.scala index 5220eaa15e9..5aabc90c466 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/ChannelMarkerManager.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/ChannelMarkerManager.scala @@ -7,7 +7,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.ChannelMarkerTy REQUIRE_ALIGNMENT } import edu.uci.ics.amber.engine.common.{AmberLogging, CheckpointState} -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, ChannelIdentity, ChannelMarkerIdentity diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DPThread.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DPThread.scala index 2bdda981f67..5c330438405 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DPThread.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DPThread.scala @@ -16,7 +16,7 @@ import 
edu.uci.ics.amber.engine.common.ambermessage.{ } import edu.uci.ics.amber.engine.common.virtualidentity.util.SELF import edu.uci.ics.amber.error.ErrorUtils.safely -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} import java.util.concurrent._ diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessor.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessor.scala index 420c9cefffc..f9f3621bbe8 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessor.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessor.scala @@ -31,8 +31,8 @@ import edu.uci.ics.amber.engine.common.ambermessage._ import edu.uci.ics.amber.engine.common.statetransition.WorkerStateManager import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER import edu.uci.ics.amber.error.ErrorUtils.{mkConsoleMessage, safely} -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.workflow.PortIdentity class DataProcessor( actorId: ActorVirtualIdentity, diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorRPCHandlerInitializer.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorRPCHandlerInitializer.scala index a0df688852f..c96a9b24323 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorRPCHandlerInitializer.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorRPCHandlerInitializer.scala @@ -12,7 +12,7 @@ import 
edu.uci.ics.amber.engine.architecture.rpc.workerservice.WorkerServiceFs2G import edu.uci.ics.amber.engine.architecture.worker.promisehandlers._ import edu.uci.ics.amber.engine.common.AmberLogging import edu.uci.ics.amber.engine.common.rpc.AsyncRPCHandlerInitializer -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity class DataProcessorRPCHandlerInitializer(val dp: DataProcessor) extends AsyncRPCHandlerInitializer(dp.asyncRPCClient, dp.asyncRPCServer) diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/PauseManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/PauseManager.scala index bdaf5b7f800..078aeebd696 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/PauseManager.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/PauseManager.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.worker import edu.uci.ics.amber.engine.architecture.messaginglayer.InputGateway import edu.uci.ics.amber.engine.common.AmberLogging -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} import scala.collection.mutable diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/PauseType.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/PauseType.scala index 063646941c8..8106c8a9e7b 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/PauseType.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/PauseType.scala @@ -1,6 +1,6 @@ package edu.uci.ics.amber.engine.architecture.worker -import edu.uci.ics.amber.virtualidentity.ChannelMarkerIdentity +import edu.uci.ics.amber.core.virtualidentity.ChannelMarkerIdentity sealed trait PauseType diff 
--git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/WorkflowWorker.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/WorkflowWorker.scala index 5df91c4f138..4aeaed555ce 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/WorkflowWorker.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/WorkflowWorker.scala @@ -12,7 +12,7 @@ import edu.uci.ics.amber.engine.common.actormessage.{ActorCommand, Backpressure} import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage import edu.uci.ics.amber.engine.common.ambermessage.WorkflowMessage.getInMemSize import edu.uci.ics.amber.engine.common.{CheckpointState, SerializedState} -import edu.uci.ics.amber.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ChannelIdentity, ChannelMarkerIdentity} import java.net.URI import java.util.concurrent.LinkedBlockingQueue diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/SerializationManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/SerializationManager.scala index ae9ebb2393b..a6ea4f74e4b 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/SerializationManager.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/SerializationManager.scala @@ -11,8 +11,8 @@ import edu.uci.ics.amber.engine.common.{ CheckpointSupport } import edu.uci.ics.amber.util.VirtualIdentityUtils -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity class SerializationManager(val actorId: ActorVirtualIdentity) extends AmberLogging { diff --git 
a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/StatisticsManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/StatisticsManager.scala index 5be4702372c..fa0d437977b 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/StatisticsManager.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/StatisticsManager.scala @@ -5,7 +5,7 @@ import edu.uci.ics.amber.engine.architecture.worker.statistics.{ PortTupleCountMapping, WorkerStatistics } -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import scala.collection.mutable diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/PrepareCheckpointHandler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/PrepareCheckpointHandler.scala index 1b9fa274f20..6c00bcbd43f 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/PrepareCheckpointHandler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/PrepareCheckpointHandler.scala @@ -13,7 +13,7 @@ import edu.uci.ics.amber.engine.architecture.worker.{ } import edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage import edu.uci.ics.amber.engine.common.{CheckpointState, CheckpointSupport, SerializedState} -import edu.uci.ics.amber.virtualidentity.ChannelMarkerIdentity +import edu.uci.ics.amber.core.virtualidentity.ChannelMarkerIdentity import java.util.concurrent.CompletableFuture import scala.collection.mutable diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/StartHandler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/StartHandler.scala index 066271c3509..316f70cba0a 100644 --- 
a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/StartHandler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/StartHandler.scala @@ -10,8 +10,8 @@ import edu.uci.ics.amber.engine.architecture.worker.DataProcessorRPCHandlerIniti import edu.uci.ics.amber.engine.architecture.worker.statistics.WorkerState.{READY, RUNNING} import edu.uci.ics.amber.engine.common.ambermessage.MarkerFrame import edu.uci.ics.amber.engine.common.virtualidentity.util.SOURCE_STARTER_ACTOR -import edu.uci.ics.amber.virtualidentity.ChannelIdentity -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.ChannelIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity trait StartHandler { this: DataProcessorRPCHandlerInitializer => diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/AmberLogging.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/AmberLogging.scala index d437c5e3217..3844cf59350 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/AmberLogging.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/AmberLogging.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.common import com.typesafe.scalalogging.Logger import edu.uci.ics.amber.util.VirtualIdentityUtils -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity import org.slf4j.LoggerFactory trait AmberLogging { diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/CheckpointSupport.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/CheckpointSupport.scala index f4e5fabdffa..aab7c9daf7b 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/CheckpointSupport.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/CheckpointSupport.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.engine.common 
import edu.uci.ics.amber.core.tuple.TupleLike -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity trait CheckpointSupport { def serializeState( diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/ambermessage/RecoveryPayload.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/ambermessage/RecoveryPayload.scala index f23c49ae9a1..f883d772bf6 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/ambermessage/RecoveryPayload.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/ambermessage/RecoveryPayload.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.engine.common.ambermessage import akka.actor.{ActorRef, Address} -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity sealed trait RecoveryPayload extends Serializable {} diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/ambermessage/WorkflowMessage.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/ambermessage/WorkflowMessage.scala index 1b52484130b..5a8b68687b2 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/ambermessage/WorkflowMessage.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/ambermessage/WorkflowMessage.scala @@ -1,6 +1,6 @@ package edu.uci.ics.amber.engine.common.ambermessage -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} case object WorkflowMessage { def getInMemSize(msg: WorkflowMessage): Long = { diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/client/ClientActor.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/client/ClientActor.scala index 266d9a94b42..83b9914465f 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/client/ClientActor.scala +++ 
b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/client/ClientActor.scala @@ -34,7 +34,7 @@ import edu.uci.ics.amber.engine.common.client.ClientActor.{ import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient import edu.uci.ics.amber.engine.common.virtualidentity.util.{CLIENT, CONTROLLER} import edu.uci.ics.amber.error.ErrorUtils.reconstructThrowable -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} import scala.collection.mutable diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCClient.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCClient.scala index a66f638d238..b7f8454e317 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCClient.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCClient.scala @@ -16,7 +16,7 @@ import edu.uci.ics.amber.engine.common.AmberLogging import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient.createProxy import edu.uci.ics.amber.engine.common.virtualidentity.util.CLIENT import edu.uci.ics.amber.error.ErrorUtils.reconstructThrowable -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, ChannelIdentity, ChannelMarkerIdentity diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCHandlerInitializer.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCHandlerInitializer.scala index 2af64655404..4205027fda4 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCHandlerInitializer.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCHandlerInitializer.scala @@ -6,7 +6,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands._ import 
edu.uci.ics.amber.engine.architecture.rpc.controllerservice.ControllerServiceFs2Grpc import edu.uci.ics.amber.engine.architecture.rpc.controlreturns._ import edu.uci.ics.amber.engine.architecture.rpc.workerservice.WorkerServiceFs2Grpc -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, ChannelIdentity, ChannelMarkerIdentity diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCServer.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCServer.scala index dfdc056ccbd..eae8f455049 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCServer.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCServer.scala @@ -10,7 +10,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.{ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.{ControlReturn, ReturnInvocation} import edu.uci.ics.amber.engine.common.AmberLogging import edu.uci.ics.amber.error.ErrorUtils.mkControlError -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity import java.lang.reflect.Method import scala.collection.mutable diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/statetransition/StateManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/statetransition/StateManager.scala index db54fff11ff..d346497ec39 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/statetransition/StateManager.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/statetransition/StateManager.scala @@ -5,7 +5,7 @@ import edu.uci.ics.amber.engine.common.statetransition.StateManager.{ InvalidStateException, InvalidTransitionException } -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity object StateManager 
{ diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/statetransition/WorkerStateManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/statetransition/WorkerStateManager.scala index 50383e6e618..199925d91ef 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/statetransition/WorkerStateManager.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/statetransition/WorkerStateManager.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.common.statetransition import edu.uci.ics.amber.engine.architecture.worker.statistics.WorkerState import edu.uci.ics.amber.engine.architecture.worker.statistics.WorkerState._ -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity // The following pattern is a good practice of enum in scala // We've always used this pattern in the codebase diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/virtualidentity/util.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/virtualidentity/util.scala index 2ccdf56bb85..39a2ea407f4 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/virtualidentity/util.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/virtualidentity/util.scala @@ -1,6 +1,6 @@ package edu.uci.ics.amber.engine.common.virtualidentity -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, OperatorIdentity, PhysicalOpIdentity diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/error/ErrorUtils.scala b/core/amber/src/main/scala/edu/uci/ics/amber/error/ErrorUtils.scala index b359fe15910..c1569587fbb 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/error/ErrorUtils.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/error/ErrorUtils.scala @@ -5,7 +5,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.ConsoleMessage import 
edu.uci.ics.amber.engine.architecture.rpc.controlcommands.ConsoleMessageType.ERROR import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.{ControlError, ErrorLanguage} import edu.uci.ics.amber.util.VirtualIdentityUtils -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity import java.time.Instant import scala.util.control.ControlThrowable diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/ComputingUnitMaster.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/ComputingUnitMaster.scala index eba2c81df24..f641440c643 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/ComputingUnitMaster.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/ComputingUnitMaster.scala @@ -15,7 +15,7 @@ import edu.uci.ics.amber.engine.common.Utils.{maptoStatusCode, objectMapper} import edu.uci.ics.amber.engine.common.client.AmberClient import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage import edu.uci.ics.amber.engine.common.{AmberConfig, AmberRuntime, Utils} -import edu.uci.ics.amber.virtualidentity.ExecutionIdentity +import edu.uci.ics.amber.core.virtualidentity.ExecutionIdentity import edu.uci.ics.texera.web.auth.JwtAuth.setupJwtAuth import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.WorkflowExecutions import edu.uci.ics.texera.web.resource.WorkflowWebsocketResource diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/websocket/event/WorkflowErrorEvent.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/websocket/event/WorkflowErrorEvent.scala index a89dbc98567..bede408b24f 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/model/websocket/event/WorkflowErrorEvent.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/model/websocket/event/WorkflowErrorEvent.scala @@ -1,6 +1,6 @@ package edu.uci.ics.texera.web.model.websocket.event -import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError 
+import edu.uci.ics.amber.core.workflowruntimestate.WorkflowFatalError case class WorkflowErrorEvent( fatalErrors: Seq[WorkflowFatalError] diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/WorkflowWebsocketResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/WorkflowWebsocketResource.scala index a14843596be..badece30c22 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/WorkflowWebsocketResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/WorkflowWebsocketResource.scala @@ -5,9 +5,9 @@ import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.clustering.ClusterListener import edu.uci.ics.amber.engine.common.Utils.objectMapper import edu.uci.ics.amber.error.ErrorUtils.getStackTraceWithAllCauses -import edu.uci.ics.amber.virtualidentity.WorkflowIdentity -import edu.uci.ics.amber.workflowruntimestate.FatalErrorType.COMPILATION_ERROR -import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError +import edu.uci.ics.amber.core.virtualidentity.WorkflowIdentity +import edu.uci.ics.amber.core.workflowruntimestate.FatalErrorType.COMPILATION_ERROR +import edu.uci.ics.amber.core.workflowruntimestate.WorkflowFatalError import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import edu.uci.ics.texera.web.model.websocket.event.{WorkflowErrorEvent, WorkflowStateEvent} import edu.uci.ics.texera.web.model.websocket.request._ diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala index 8758fab0b85..132e9bdd7b3 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala @@ -3,7 +3,7 @@ package 
edu.uci.ics.texera.web.resource.dashboard.user.workflow import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.amber.engine.architecture.logreplay.{ReplayDestination, ReplayLogRecord} import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage -import edu.uci.ics.amber.virtualidentity.{ChannelMarkerIdentity, ExecutionIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ChannelMarkerIdentity, ExecutionIdentity} import edu.uci.ics.texera.dao.SqlServer import edu.uci.ics.texera.web.auth.SessionUser import edu.uci.ics.texera.dao.jooq.generated.Tables.{ diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionConsoleService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionConsoleService.scala index 3013f0bd19a..19fb357a73a 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionConsoleService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionConsoleService.scala @@ -16,7 +16,7 @@ import edu.uci.ics.amber.engine.common.executionruntimestate.{ OperatorConsole } import edu.uci.ics.amber.util.VirtualIdentityUtils -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity import edu.uci.ics.texera.web.model.websocket.event.TexeraWebSocketEvent import edu.uci.ics.texera.web.model.websocket.event.python.ConsoleUpdateEvent import edu.uci.ics.texera.web.model.websocket.request.RetryRequest diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala index 70a21bba475..11c0002a311 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionResultService.scala @@ -19,9 +19,9 @@ import 
edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregat import edu.uci.ics.amber.engine.common.client.AmberClient import edu.uci.ics.amber.engine.common.executionruntimestate.ExecutionMetadataStore import edu.uci.ics.amber.engine.common.{AmberConfig, AmberRuntime} -import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.{OperatorIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.PortIdentity import edu.uci.ics.texera.web.SubscriptionManager import edu.uci.ics.texera.web.model.websocket.event.{ PaginatedResultEvent, diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionRuntimeService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionRuntimeService.scala index 74fba31a03e..db7fa09b75f 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionRuntimeService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionRuntimeService.scala @@ -9,7 +9,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.{ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregatedState._ import edu.uci.ics.amber.engine.architecture.worker.WorkflowWorker.FaultToleranceConfig import edu.uci.ics.amber.engine.common.client.AmberClient -import edu.uci.ics.amber.virtualidentity.ChannelMarkerIdentity +import edu.uci.ics.amber.core.virtualidentity.ChannelMarkerIdentity import edu.uci.ics.texera.web.model.websocket.request._ import edu.uci.ics.texera.web.storage.ExecutionStateStore import edu.uci.ics.texera.web.storage.ExecutionStateStore.updateWorkflowState diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionStatsService.scala 
b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionStatsService.scala index 38a1098c58e..a333d16d060 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionStatsService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionStatsService.scala @@ -21,8 +21,8 @@ import edu.uci.ics.amber.engine.common.executionruntimestate.{ } import edu.uci.ics.amber.engine.common.{AmberConfig, Utils} import edu.uci.ics.amber.error.ErrorUtils.{getOperatorFromActorIdOpt, getStackTraceWithAllCauses} -import edu.uci.ics.amber.workflowruntimestate.FatalErrorType.EXECUTION_FAILURE -import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError +import edu.uci.ics.amber.core.workflowruntimestate.FatalErrorType.EXECUTION_FAILURE +import edu.uci.ics.amber.core.workflowruntimestate.WorkflowFatalError import edu.uci.ics.texera.web.SubscriptionManager import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.WorkflowRuntimeStatistics import edu.uci.ics.texera.web.model.websocket.event.{ diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionsMetadataPersistService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionsMetadataPersistService.scala index d8b259acdaa..0016b89a480 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionsMetadataPersistService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ExecutionsMetadataPersistService.scala @@ -4,7 +4,7 @@ import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.amber.core.workflow.WorkflowContext.DEFAULT_EXECUTION_ID import edu.uci.ics.amber.engine.common.AmberConfig -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.texera.dao.SqlServer import 
edu.uci.ics.texera.dao.jooq.generated.tables.daos.WorkflowExecutionsDao import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.WorkflowExecutions diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/FriesReconfigurationAlgorithm.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/FriesReconfigurationAlgorithm.scala index 12b7c27a53b..85113b3a5ad 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/FriesReconfigurationAlgorithm.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/FriesReconfigurationAlgorithm.scala @@ -6,7 +6,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.{ PropagateChannelMarkerRequest } import edu.uci.ics.amber.engine.architecture.scheduling.{Region, WorkflowExecutionCoordinator} -import edu.uci.ics.amber.virtualidentity.PhysicalOpIdentity +import edu.uci.ics.amber.core.virtualidentity.PhysicalOpIdentity import org.jgrapht.alg.connectivity.ConnectivityInspector import scala.collection.mutable diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala index 83e3d89f347..7d112f30afc 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/ResultExportService.scala @@ -12,7 +12,7 @@ import edu.uci.ics.amber.core.storage.result.{OpResultStorage, ResultStorage} import edu.uci.ics.amber.core.tuple.Tuple import edu.uci.ics.amber.engine.common.Utils.retry import edu.uci.ics.amber.util.PathUtils -import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.virtualidentity.{OperatorIdentity, WorkflowIdentity} import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import edu.uci.ics.texera.web.model.websocket.request.ResultExportRequest import 
edu.uci.ics.texera.web.model.websocket.response.ResultExportResponse @@ -24,7 +24,7 @@ import edu.uci.ics.texera.web.resource.dashboard.user.dataset.DatasetResource.{ import edu.uci.ics.texera.web.resource.dashboard.user.workflow.WorkflowVersionResource import org.jooq.types.UInteger import edu.uci.ics.amber.util.ArrowUtils -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import java.io.{PipedInputStream, PipedOutputStream} import java.nio.charset.StandardCharsets diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala index 6f0b7868e54..9610e00c3a4 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/service/WorkflowService.scala @@ -16,13 +16,13 @@ import edu.uci.ics.amber.engine.architecture.worker.WorkflowWorker.{ } import edu.uci.ics.amber.engine.common.AmberConfig import edu.uci.ics.amber.error.ErrorUtils.{getOperatorFromActorIdOpt, getStackTraceWithAllCauses} -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ChannelMarkerIdentity, ExecutionIdentity, WorkflowIdentity } -import edu.uci.ics.amber.workflowruntimestate.FatalErrorType.EXECUTION_FAILURE -import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError +import edu.uci.ics.amber.core.workflowruntimestate.FatalErrorType.EXECUTION_FAILURE +import edu.uci.ics.amber.core.workflowruntimestate.WorkflowFatalError import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.User import edu.uci.ics.texera.web.model.websocket.event.TexeraWebSocketEvent import edu.uci.ics.texera.web.model.websocket.request.WorkflowExecuteRequest diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/storage/ExecutionReconfigurationStore.scala 
b/core/amber/src/main/scala/edu/uci/ics/texera/web/storage/ExecutionReconfigurationStore.scala index aa192f98141..ea5d44c722f 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/web/storage/ExecutionReconfigurationStore.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/storage/ExecutionReconfigurationStore.scala @@ -2,7 +2,7 @@ package edu.uci.ics.texera.web.storage import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.operator.StateTransferFunc -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity case class ExecutionReconfigurationStore( currentReconfigId: Option[String] = None, diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalLink.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalLink.scala index 7e4959d972e..d9123ece957 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalLink.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalLink.scala @@ -1,8 +1,8 @@ package edu.uci.ics.texera.workflow import com.fasterxml.jackson.annotation.{JsonCreator, JsonProperty} -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity case class LogicalLink( @JsonProperty("fromOpId") fromOpId: OperatorIdentity, diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala index f9475fd6bb9..a76b4b589af 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala @@ -4,7 +4,7 @@ import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.storage.FileResolver import edu.uci.ics.amber.operator.LogicalOp import 
edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc -import edu.uci.ics.amber.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity import edu.uci.ics.texera.web.model.websocket.request.LogicalPlanPojo import org.jgrapht.graph.DirectedAcyclicGraph import org.jgrapht.util.SupplierUtil diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala index 795939f1ea0..6c13eadbffe 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala @@ -6,9 +6,9 @@ import edu.uci.ics.amber.core.workflow.{PhysicalPlan, WorkflowContext} import edu.uci.ics.amber.engine.architecture.controller.Workflow import edu.uci.ics.amber.engine.common.Utils.objectMapper import edu.uci.ics.amber.operator.SpecialPhysicalOpFactory -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.OutputPort.OutputMode.SINGLE_SNAPSHOT -import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode.SINGLE_SNAPSHOT +import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} import edu.uci.ics.texera.web.model.websocket.request.LogicalPlanPojo import edu.uci.ics.texera.web.service.ExecutionsMetadataPersistService diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/TrivialControlSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/TrivialControlSpec.scala index f24fa17becb..0fe1c9aafb3 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/TrivialControlSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/TrivialControlSpec.scala @@ -20,7 +20,7 @@ import 
edu.uci.ics.amber.engine.common.ambermessage.WorkflowFIFOMessage import edu.uci.ics.amber.engine.common.ambermessage.WorkflowMessage.getInMemSize import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient.ControlInvocation import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} import io.grpc.MethodDescriptor import org.scalatest.wordspec.AnyWordSpecLike import org.scalatest.{BeforeAndAfterAll, BeforeAndAfterEach} diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/MultiCallHandler.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/MultiCallHandler.scala index b1fdb1c46bb..b8f8bbef476 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/MultiCallHandler.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/MultiCallHandler.scala @@ -3,7 +3,7 @@ package edu.uci.ics.amber.engine.architecture.control.utils import com.twitter.util.Future import edu.uci.ics.amber.engine.architecture.rpc.controlcommands._ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns._ -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity trait MultiCallHandler { this: TesterAsyncRPCHandlerInitializer => diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/TesterAsyncRPCHandlerInitializer.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/TesterAsyncRPCHandlerInitializer.scala index 4b1d0130c96..5ddbbc2348f 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/TesterAsyncRPCHandlerInitializer.scala +++ 
b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/TesterAsyncRPCHandlerInitializer.scala @@ -5,7 +5,7 @@ import edu.uci.ics.amber.engine.architecture.control.utils.TrivialControlTester. import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.AsyncRPCContext import edu.uci.ics.amber.engine.architecture.rpc.testerservice.RPCTesterFs2Grpc import edu.uci.ics.amber.engine.common.rpc.{AsyncRPCHandlerInitializer, AsyncRPCServer} -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity class TesterAsyncRPCHandlerInitializer( val myID: ActorVirtualIdentity, diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/TrivialControlTester.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/TrivialControlTester.scala index 1cf91bc9f21..cef7469114c 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/TrivialControlTester.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/control/utils/TrivialControlTester.scala @@ -15,7 +15,7 @@ import edu.uci.ics.amber.engine.common.ambermessage.{ WorkflowFIFOMessage } import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} object TrivialControlTester { class ControlTesterRPCClient(outputGateway: NetworkOutputGateway, actorId: ActorVirtualIdentity) diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGatewaySpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGatewaySpec.scala index 18a401444ac..2203ff7f2a9 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGatewaySpec.scala +++ 
b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGatewaySpec.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.engine.architecture.messaginglayer import edu.uci.ics.amber.core.tuple.{AttributeType, Schema, TupleLike} import edu.uci.ics.amber.engine.common.ambermessage.{DataFrame, WorkflowFIFOMessage} -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} import org.scalamock.scalatest.MockFactory import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManagerSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManagerSpec.scala index 0cfb9c4f753..916f6cf3e81 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManagerSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManagerSpec.scala @@ -5,13 +5,13 @@ import edu.uci.ics.amber.core.marker.EndOfInputChannel import edu.uci.ics.amber.core.tuple.{AttributeType, Schema, TupleLike} import edu.uci.ics.amber.engine.architecture.sendsemantics.partitionings.OneToOnePartitioning import edu.uci.ics.amber.engine.common.ambermessage._ -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, ChannelIdentity, OperatorIdentity, PhysicalOpIdentity } -import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity} +import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} import org.scalamock.scalatest.MockFactory import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/RangeBasedShuffleSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/RangeBasedShuffleSpec.scala index 
721eb6f509c..59034227074 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/RangeBasedShuffleSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/RangeBasedShuffleSpec.scala @@ -3,7 +3,7 @@ package edu.uci.ics.amber.engine.architecture.messaginglayer import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema, Tuple} import edu.uci.ics.amber.engine.architecture.sendsemantics.partitioners.RangeBasedShufflePartitioner import edu.uci.ics.amber.engine.architecture.sendsemantics.partitionings.RangeBasedShufflePartitioning -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} import org.scalamock.scalatest.MockFactory import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonWorkflowWorkerSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonWorkflowWorkerSpec.scala index 3e2a3642ae9..9a5ae6c99c5 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonWorkflowWorkerSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonWorkflowWorkerSpec.scala @@ -21,7 +21,7 @@ //} //import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient.{ControlInvocation, ReturnInvocation} //import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER -//import edu.uci.ics.amber.virtualidentity.{ +//import edu.uci.ics.amber.core.virtualidentity.{ // ActorVirtualIdentity, // PhysicalLink, // PhysicalLink, diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGeneratorSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGeneratorSpec.scala index d67524555e7..708e8ff7ad4 100644 --- 
a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGeneratorSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGeneratorSpec.scala @@ -4,7 +4,7 @@ import edu.uci.ics.amber.core.workflow.WorkflowContext import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER import edu.uci.ics.amber.engine.e2e.TestUtils.buildWorkflow import edu.uci.ics.amber.operator.TestOperators -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import edu.uci.ics.texera.workflow.LogicalLink import org.scalamock.scalatest.MockFactory import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGeneratorSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGeneratorSpec.scala index c28c3265a20..93363e1fdf4 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGeneratorSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/ExpansionGreedyScheduleGeneratorSpec.scala @@ -5,8 +5,8 @@ import edu.uci.ics.amber.engine.e2e.TestUtils.buildWorkflow import edu.uci.ics.amber.operator.TestOperators import edu.uci.ics.amber.operator.split.SplitOpDesc import edu.uci.ics.amber.operator.udf.python.{DualInputPortsPythonUDFOpDescV2, PythonUDFOpDescV2} -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import edu.uci.ics.texera.workflow.LogicalLink import org.scalamock.scalatest.MockFactory import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DPThreadSpec.scala 
b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DPThreadSpec.scala index 1eac2a1d909..8c8cedefcb7 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DPThreadSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DPThreadSpec.scala @@ -18,8 +18,8 @@ import edu.uci.ics.amber.engine.common.ambermessage.{DataFrame, WorkflowFIFOMess import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient.ControlInvocation import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage import edu.uci.ics.amber.engine.common.virtualidentity.util.SELF -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.workflow.PortIdentity import org.scalamock.scalatest.MockFactory import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorSpec.scala index 0b9f22823ca..a3b62cfabca 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorSpec.scala @@ -16,13 +16,13 @@ import edu.uci.ics.amber.engine.common.ambermessage.{DataFrame, MarkerFrame, Wor import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient.ControlInvocation import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER import edu.uci.ics.amber.util.VirtualIdentityUtils -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, ChannelIdentity, OperatorIdentity, PhysicalOpIdentity } -import edu.uci.ics.amber.workflow.PortIdentity +import 
edu.uci.ics.amber.core.workflow.PortIdentity import org.scalamock.scalatest.MockFactory import org.scalatest.BeforeAndAfterEach import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala index f723a9bd7af..aab2548242a 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala @@ -20,13 +20,13 @@ import edu.uci.ics.amber.engine.common.AmberRuntime import edu.uci.ics.amber.engine.common.ambermessage.{DataFrame, DataPayload, WorkflowFIFOMessage} import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, ChannelIdentity, OperatorIdentity, PhysicalOpIdentity } -import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity} +import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} import org.scalamock.scalatest.MockFactory import org.scalatest.BeforeAndAfterAll import org.scalatest.flatspec.AnyFlatSpecLike diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/BatchSizePropagationSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/BatchSizePropagationSpec.scala index a62d37ca284..9eca6949048 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/BatchSizePropagationSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/BatchSizePropagationSpec.scala @@ -12,7 +12,7 @@ import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER import edu.uci.ics.amber.engine.e2e.TestUtils.buildWorkflow import edu.uci.ics.amber.operator.TestOperators import edu.uci.ics.amber.operator.aggregate.AggregationFunction -import 
edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import edu.uci.ics.texera.workflow.LogicalLink import org.scalatest.flatspec.AnyFlatSpecLike import org.scalatest.{BeforeAndAfterAll, BeforeAndAfterEach} diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala index 2f6f8ab67d5..9ef3a1ae101 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/DataProcessingSpec.scala @@ -18,8 +18,8 @@ import edu.uci.ics.amber.engine.common.client.AmberClient import edu.uci.ics.amber.engine.e2e.TestUtils.buildWorkflow import edu.uci.ics.amber.operator.TestOperators import edu.uci.ics.amber.operator.aggregate.AggregationFunction -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import edu.uci.ics.texera.workflow.LogicalLink import org.scalatest.flatspec.AnyFlatSpecLike import org.scalatest.{BeforeAndAfterAll, BeforeAndAfterEach} diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/PauseSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/PauseSpec.scala index 014f3080b98..c1b0f319c50 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/PauseSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/e2e/PauseSpec.scala @@ -15,7 +15,7 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.WorkflowAggregat import edu.uci.ics.amber.engine.common.AmberRuntime import edu.uci.ics.amber.engine.common.client.AmberClient import edu.uci.ics.amber.operator.{LogicalOp, TestOperators} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import 
edu.uci.ics.texera.workflow.LogicalLink import org.scalatest.BeforeAndAfterAll import org.scalatest.flatspec.AnyFlatSpecLike diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/CheckpointSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/CheckpointSpec.scala index 6c694b989c1..cd423741b10 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/CheckpointSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/CheckpointSpec.scala @@ -12,7 +12,7 @@ import edu.uci.ics.amber.engine.common.virtualidentity.util.{CONTROLLER, SELF} import edu.uci.ics.amber.engine.common.{AmberRuntime, CheckpointState} import edu.uci.ics.amber.engine.e2e.TestUtils.buildWorkflow import edu.uci.ics.amber.operator.TestOperators -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import edu.uci.ics.texera.workflow.LogicalLink import org.scalatest.BeforeAndAfterAll import org.scalatest.flatspec.AnyFlatSpecLike diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/LoggingSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/LoggingSpec.scala index 42962ef3352..fade388c4ff 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/LoggingSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/LoggingSpec.scala @@ -25,13 +25,13 @@ import edu.uci.ics.amber.engine.common.ambermessage.{ import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient.ControlInvocation import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage import edu.uci.ics.amber.engine.common.virtualidentity.util.{CONTROLLER, SELF} -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, ChannelIdentity, OperatorIdentity, PhysicalOpIdentity } -import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity} +import 
edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} import org.scalatest.BeforeAndAfterAll import org.scalatest.flatspec.AnyFlatSpecLike diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/ReplaySpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/ReplaySpec.scala index d52402798ea..3a2a9b9ba88 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/ReplaySpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/ReplaySpec.scala @@ -16,7 +16,7 @@ import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient.ControlInvocation import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage import edu.uci.ics.amber.engine.common.storage.SequentialRecordStorage.SequentialRecordReader import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER -import edu.uci.ics.amber.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ActorVirtualIdentity, ChannelIdentity} import org.scalatest.BeforeAndAfterAll import org.scalatest.flatspec.AnyFlatSpecLike diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala index 4a50f33f806..6dd8ebbce50 100644 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala @@ -10,10 +10,10 @@ import edu.uci.ics.amber.compiler.WorkflowCompiler.{ import edu.uci.ics.amber.compiler.model.{LogicalPlan, LogicalPlanPojo} import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{PhysicalPlan, WorkflowContext} -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PhysicalLink -import 
edu.uci.ics.amber.workflowruntimestate.FatalErrorType.COMPILATION_ERROR -import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.workflow.PhysicalLink +import edu.uci.ics.amber.core.workflowruntimestate.FatalErrorType.COMPILATION_ERROR +import edu.uci.ics.amber.core.workflowruntimestate.WorkflowFatalError import java.time.Instant import scala.collection.mutable diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalLink.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalLink.scala index 283526132a5..bf9217128b2 100644 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalLink.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalLink.scala @@ -1,8 +1,8 @@ package edu.uci.ics.amber.compiler.model import com.fasterxml.jackson.annotation.{JsonCreator, JsonProperty} -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity case class LogicalLink( @JsonProperty("fromOpId") fromOpId: OperatorIdentity, diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala index 8b599176cd7..ea79ba5ceb7 100644 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala @@ -7,8 +7,8 @@ import edu.uci.ics.amber.core.workflow.WorkflowContext import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor import 
edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import org.jgrapht.graph.DirectedAcyclicGraph import org.jgrapht.util.SupplierUtil diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResource.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResource.scala index a167e46203b..1492d7f7d48 100644 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResource.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResource.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.compiler.WorkflowCompiler import edu.uci.ics.amber.compiler.model.LogicalPlanPojo import edu.uci.ics.amber.core.tuple.Attribute import edu.uci.ics.amber.core.workflow.{PhysicalPlan, WorkflowContext} -import edu.uci.ics.amber.virtualidentity.WorkflowIdentity -import edu.uci.ics.amber.workflowruntimestate.WorkflowFatalError +import edu.uci.ics.amber.core.virtualidentity.WorkflowIdentity +import edu.uci.ics.amber.core.workflowruntimestate.WorkflowFatalError import jakarta.annotation.security.RolesAllowed import jakarta.ws.rs.{Consumes, POST, Path, Produces} import jakarta.ws.rs.core.MediaType diff --git a/core/workflow-compiling-service/src/test/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResourceSpec.scala b/core/workflow-compiling-service/src/test/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResourceSpec.scala index 669df1e6e60..94150bee341 100644 --- a/core/workflow-compiling-service/src/test/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResourceSpec.scala +++ 
b/core/workflow-compiling-service/src/test/scala/edu/uci/ics/texera/service/resource/WorkflowCompilationResourceSpec.scala @@ -8,7 +8,7 @@ import edu.uci.ics.amber.compiler.model.{LogicalLink, LogicalPlanPojo} import edu.uci.ics.amber.operator.projection.{AttributeUnit, ProjectionOpDesc} import edu.uci.ics.amber.operator.source.scan.csv.CSVScanSourceOpDesc import edu.uci.ics.amber.util.JSONUtils.objectMapper -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import org.scalatest.flatspec.AnyFlatSpec import org.scalatest.BeforeAndAfterAll import com.fasterxml.jackson.databind.node.ObjectNode diff --git a/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/virtualidentity.proto b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/virtualidentity.proto similarity index 95% rename from core/workflow-core/src/main/protobuf/edu/uci/ics/amber/virtualidentity.proto rename to core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/virtualidentity.proto index 272e94954a1..e8ed027ab0a 100644 --- a/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/virtualidentity.proto +++ b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/virtualidentity.proto @@ -1,6 +1,6 @@ syntax = "proto3"; -package edu.uci.ics.amber; +package edu.uci.ics.amber.core; import "scalapb/scalapb.proto"; diff --git a/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflow.proto b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/workflow.proto similarity index 93% rename from core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflow.proto rename to core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/workflow.proto index 0ee4c68d36a..55180f3aa6d 100644 --- a/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflow.proto +++ b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/workflow.proto @@ -1,8 +1,8 @@ syntax = "proto3"; -package edu.uci.ics.amber; +package edu.uci.ics.amber.core; 
-import "edu/uci/ics/amber/virtualidentity.proto"; +import "edu/uci/ics/amber/core/virtualidentity.proto"; import "scalapb/scalapb.proto"; option (scalapb.options) = { diff --git a/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflowruntimestate.proto b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/workflowruntimestate.proto similarity index 94% rename from core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflowruntimestate.proto rename to core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/workflowruntimestate.proto index 60af1c1e3a7..2666e7074a6 100644 --- a/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/workflowruntimestate.proto +++ b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/workflowruntimestate.proto @@ -1,6 +1,6 @@ syntax = "proto3"; -package edu.uci.ics.amber; +package edu.uci.ics.amber.core; import "google/protobuf/timestamp.proto"; import "scalapb/scalapb.proto"; diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/WorkflowRuntimeException.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/WorkflowRuntimeException.scala index ba333a5f499..443a3b3cf4f 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/WorkflowRuntimeException.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/WorkflowRuntimeException.scala @@ -1,6 +1,6 @@ package edu.uci.ics.amber.core -import edu.uci.ics.amber.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity class WorkflowRuntimeException( val message: String, diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/OperatorExecutor.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/OperatorExecutor.scala index f286d7e6aca..a718b4a409f 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/OperatorExecutor.scala +++ 
b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/OperatorExecutor.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.core.executor import edu.uci.ics.amber.core.marker.State import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity trait OperatorExecutor { diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/SinkOperatorExecutor.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/SinkOperatorExecutor.scala index 50ec79f3a8e..9b4cd31b84d 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/SinkOperatorExecutor.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/SinkOperatorExecutor.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.core.executor import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity trait SinkOperatorExecutor extends OperatorExecutor { diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/SourceOperatorExecutor.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/SourceOperatorExecutor.scala index efa6fd40bcc..84685c59a59 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/SourceOperatorExecutor.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/SourceOperatorExecutor.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.core.executor import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity trait SourceOperatorExecutor extends OperatorExecutor { override def open(): Unit = {} diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala 
b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala index 42728231270..83c25a2488b 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/OpResultStorage.scala @@ -4,8 +4,8 @@ import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.storage.StorageConfig import edu.uci.ics.amber.core.storage.model.VirtualDocument import edu.uci.ics.amber.core.tuple.{Schema, Tuple} -import edu.uci.ics.amber.virtualidentity.OperatorIdentity -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import java.util.concurrent.ConcurrentHashMap import scala.jdk.CollectionConverters.IteratorHasAsScala diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/ResultStorage.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/ResultStorage.scala index 1e9168d15a2..701ee9c01fe 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/ResultStorage.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/ResultStorage.scala @@ -1,6 +1,6 @@ package edu.uci.ics.amber.core.storage.result -import edu.uci.ics.amber.virtualidentity.WorkflowIdentity +import edu.uci.ics.amber.core.virtualidentity.WorkflowIdentity import scala.collection.mutable diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/WorkflowResultStore.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/WorkflowResultStore.scala index beb83632147..8dfd30e2f62 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/WorkflowResultStore.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/result/WorkflowResultStore.scala @@ 
-1,6 +1,6 @@ package edu.uci.ics.amber.core.storage.result -import edu.uci.ics.amber.virtualidentity.OperatorIdentity +import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity case class OperatorResultMetadata(tupleCount: Int = 0, changeDetector: String = "") diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/util/mongo/MongoCollectionManager.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/util/mongo/MongoCollectionManager.scala index 96cd5c902a9..b3f650450f8 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/util/mongo/MongoCollectionManager.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/util/mongo/MongoCollectionManager.scala @@ -1,6 +1,6 @@ package edu.uci.ics.amber.core.storage.util.mongo -import com.mongodb.client.model.{Aggregates, IndexOptions, Indexes, Sorts} +import com.mongodb.client.model.{Aggregates, Sorts} import com.mongodb.client.{FindIterable, MongoCollection} import org.bson.Document diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/TupleLike.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/TupleLike.scala index 16eddf5a8f9..206590d4d5f 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/TupleLike.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/TupleLike.scala @@ -1,6 +1,6 @@ package edu.uci.ics.amber.core.tuple -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import scala.jdk.CollectionConverters.CollectionHasAsScala diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala index 99631e4aa46..daf7cd679f9 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala +++ 
b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala @@ -4,13 +4,13 @@ import com.fasterxml.jackson.annotation.{JsonIgnore, JsonIgnoreProperties} import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.executor.{OpExecInitInfo, OpExecInitInfoWithCode} import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ExecutionIdentity, OperatorIdentity, PhysicalOpIdentity, WorkflowIdentity } -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PhysicalLink, PortIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalLink, PortIdentity} import org.jgrapht.graph.{DefaultEdge, DirectedAcyclicGraph} import org.jgrapht.traverse.TopologicalOrderIterator diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalPlan.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalPlan.scala index ce8070a1aa7..a405ea646da 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalPlan.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalPlan.scala @@ -3,12 +3,12 @@ package edu.uci.ics.amber.core.workflow import com.fasterxml.jackson.annotation.JsonIgnore import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.util.VirtualIdentityUtils -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, OperatorIdentity, PhysicalOpIdentity } -import edu.uci.ics.amber.workflow.PhysicalLink +import edu.uci.ics.amber.core.workflow.PhysicalLink import org.jgrapht.alg.connectivity.BiconnectivityInspector import org.jgrapht.alg.shortestpath.AllDirectedPaths import org.jgrapht.graph.DirectedAcyclicGraph diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/WorkflowContext.scala 
b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/WorkflowContext.scala index abb776b1cad..28b9bb31858 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/WorkflowContext.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/WorkflowContext.scala @@ -5,7 +5,7 @@ import edu.uci.ics.amber.core.workflow.WorkflowContext.{ DEFAULT_WORKFLOW_ID, DEFAULT_WORKFLOW_SETTINGS } -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} object WorkflowContext { val DEFAULT_EXECUTION_ID: ExecutionIdentity = ExecutionIdentity(1L) diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/JSONUtils.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/JSONUtils.scala index d87145c5000..9156ecc6088 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/JSONUtils.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/JSONUtils.scala @@ -6,7 +6,7 @@ import com.fasterxml.jackson.databind.{JsonNode, ObjectMapper} import com.fasterxml.jackson.module.noctordeser.NoCtorDeserModule import com.fasterxml.jackson.module.scala.DefaultScalaModule import edu.uci.ics.amber.util.serde.{PortIdentityKeyDeserializer, PortIdentityKeySerializer} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import java.text.SimpleDateFormat import scala.jdk.CollectionConverters.IteratorHasAsScala diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/VirtualIdentityUtils.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/VirtualIdentityUtils.scala index f38bb9577bb..9b30d85ee2e 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/VirtualIdentityUtils.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/VirtualIdentityUtils.scala @@ -1,6 +1,6 @@ package edu.uci.ics.amber.util -import 
edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, OperatorIdentity, PhysicalOpIdentity, diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/serde/PortIdentityKeyDeserializer.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/serde/PortIdentityKeyDeserializer.scala index 8f2691540fd..9105d4bb661 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/serde/PortIdentityKeyDeserializer.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/serde/PortIdentityKeyDeserializer.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.util.serde import com.fasterxml.jackson.databind.{DeserializationContext, KeyDeserializer} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity class PortIdentityKeyDeserializer extends KeyDeserializer { override def deserializeKey(key: String, ctxt: DeserializationContext): PortIdentity = { diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/serde/PortIdentityKeySerializer.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/serde/PortIdentityKeySerializer.scala index 1bf3a7f276f..a18a3970be3 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/serde/PortIdentityKeySerializer.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/serde/PortIdentityKeySerializer.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.util.serde import com.fasterxml.jackson.core.JsonGenerator import com.fasterxml.jackson.databind.{JsonSerializer, SerializerProvider} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity class PortIdentityKeySerializer extends JsonSerializer[PortIdentity] { override def serialize( diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala index 
e374fac80c7..08e750ed8c3 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala @@ -94,8 +94,12 @@ import edu.uci.ics.amber.operator.visualization.ternaryPlot.TernaryPlotOpDesc import edu.uci.ics.amber.operator.visualization.urlviz.UrlVizOpDesc import edu.uci.ics.amber.operator.visualization.waterfallChart.WaterfallChartOpDesc import edu.uci.ics.amber.operator.visualization.wordCloud.WordCloudOpDesc -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, OperatorIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.virtualidentity.{ + ExecutionIdentity, + OperatorIdentity, + WorkflowIdentity +} +import edu.uci.ics.amber.core.workflow.PortIdentity import org.apache.commons.lang3.builder.{EqualsBuilder, HashCodeBuilder, ToStringBuilder} import java.util.UUID diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala index 04991ef2530..c5cc4fd152f 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator import edu.uci.ics.amber.core.executor.OpExecInitInfoWithCode import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} trait PythonOperatorDescriptor extends LogicalOp { override def getPhysicalOp( diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala index 64028a3fef8..96776d36b62 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala @@ -7,15 +7,18 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.sink.ProgressiveUtils import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpExec import edu.uci.ics.amber.operator.source.cache.CacheSourceOpExec -import edu.uci.ics.amber.virtualidentity.{ +import edu.uci.ics.amber.core.virtualidentity.{ ExecutionIdentity, - OperatorIdentity, PhysicalOpIdentity, WorkflowIdentity } -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.OutputPort.OutputMode.{SET_DELTA, SET_SNAPSHOT, SINGLE_SNAPSHOT} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode.{ + SET_DELTA, + SET_SNAPSHOT, + SINGLE_SNAPSHOT +} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} object SpecialPhysicalOpFactory { def newSinkPhysicalOp( diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala index 84559fbf6ab..b27c9cac387 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala @@ -13,8 +13,12 @@ import edu.uci.ics.amber.core.workflow.{ import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeNameList import 
edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, PhysicalOpIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PhysicalLink, PortIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{
+  ExecutionIdentity,
+  PhysicalOpIdentity,
+  WorkflowIdentity
+}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalLink, PortIdentity}
 import javax.validation.constraints.{NotNull, Size}
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala
index 8ca092feb6b..2be0c18e598 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala
@@ -5,8 +5,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}

 class CartesianProductOpDesc extends LogicalOp {
   override def getPhysicalOp(
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala
index b3ef4ea3538..6921cccfba7 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala
@@ -8,8 +8,8 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.map.MapOpDesc
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 /**
   * Dictionary matcher operator matches a tuple if the specified column is in the given dictionary.
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala
index 8df2eaac6a5..a8c25ad2363 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala
@@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}

 class DifferenceOpDesc extends LogicalOp {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala
index 37d9e264c18..ae00eb38c10 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala
@@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 class DistinctOpDesc extends LogicalOp {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dummy/DummyOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dummy/DummyOpDesc.scala
index a434800007c..75ce5a933cd 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dummy/DummyOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dummy/DummyOpDesc.scala
@@ -5,7 +5,7 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.{LogicalOp, PortDescription, PortDescriptor}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}

 class DummyOpDesc extends LogicalOp with PortDescriptor {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpDesc.scala
index 4e871e158d6..28c5e44a981 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpDesc.scala
@@ -4,7 +4,7 @@ import com.google.common.base.Preconditions
 import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.{LogicalOp, StateTransferFunc}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}

 import scala.util.{Success, Try}

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala
index 770334dc396..61b87009377 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala
@@ -4,8 +4,8 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import edu.uci.ics.amber.core.executor.OpExecInitInfo
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 class SpecializedFilterOpDesc extends FilterOpDesc {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala
index c40d736d135..fe009a91989 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala
@@ -6,8 +6,12 @@ import edu.uci.ics.amber.core.executor.OpExecInitInfo
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow._
 import edu.uci.ics.amber.operator.LogicalOp
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, PhysicalOpIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PhysicalLink, PortIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{
+  ExecutionIdentity,
+  PhysicalOpIdentity,
+  WorkflowIdentity
+}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalLink, PortIdentity}
 import edu.uci.ics.amber.operator.hashJoin.HashJoinOpDesc.HASH_JOIN_INTERNAL_KEY_NAME
 import edu.uci.ics.amber.operator.metadata.annotations.{
   AutofillAttributeName,
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala
index 6ca510bc91f..a4efb8226cb 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala
@@ -5,7 +5,7 @@ import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 class HuggingFaceIrisLogisticRegressionOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(value = "petalLengthCmAttribute", required = true)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala
index 339d042a204..04a603ed85c 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.huggingFace

 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala
index 2d7d610e9c8..cf1c43dd701 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala
@@ -5,7 +5,7 @@ import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
 class HuggingFaceSpamSMSDetectionOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(value = "attribute", required = true)
   @JsonPropertyDescription("column to perform spam detection on")
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala
index 0d9731be16c..349842369fb 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.huggingFace

 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala
index 6ee770cd7d5..8fc2e999ee7 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala
@@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}

 class IntersectOpDesc extends LogicalOp {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala
index 4c7dccbe6eb..d27792c044d 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala
@@ -12,8 +12,8 @@ import edu.uci.ics.amber.operator.metadata.annotations.{
   AutofillAttributeName,
   AutofillAttributeNameOnPort1
 }
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}

 /** This Operator have two assumptions:
   *  1. The tuples in both inputs come in ascending order
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpDesc.scala
index a6c9ddfdda1..cf478610d56 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpDesc.scala
@@ -7,8 +7,8 @@ import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.filter.FilterOpDesc
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 class KeywordSearchOpDesc extends FilterOpDesc {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala
index 6194620b214..b3cf15a0e40 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala
@@ -7,8 +7,8 @@ import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.{LogicalOp, StateTransferFunc}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 import scala.util.{Success, Try}

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala
index 16024cd6e8f..8183cf14e4c 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala
@@ -10,7 +10,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.{AutofillAttributeName, HideAnnotation}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 class MachineLearningScorerOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(required = true, defaultValue = "false")
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala
index 2589e54687d..66467291eb0 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala
@@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.machineLearning.sklearnAdvanced.base
 import com.fasterxml.jackson.annotation.{JsonIgnore, JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.{
   AutofillAttributeName,
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/map/MapOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/map/MapOpDesc.scala
index d777d2bce14..5cce0ad9fb3 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/map/MapOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/map/MapOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.map

 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.{LogicalOp, StateTransferFunc}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}

 import scala.util.{Failure, Success, Try}

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/metadata/OperatorMetadataGenerator.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/metadata/OperatorMetadataGenerator.scala
index 48b3cd23515..24ee456c1c4 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/metadata/OperatorMetadataGenerator.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/metadata/OperatorMetadataGenerator.scala
@@ -7,7 +7,7 @@ import com.fasterxml.jackson.databind.node.{ArrayNode, ObjectNode}
 import com.kjetland.jackson.jsonSchema.JsonSchemaConfig.html5EnabledSchema
 import com.kjetland.jackson.jsonSchema.{JsonSchemaConfig, JsonSchemaDraft, JsonSchemaGenerator}
 import edu.uci.ics.amber.operator.LogicalOp
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.source.scan.csv.CSVScanSourceOpDesc
 import edu.uci.ics.amber.util.JSONUtils.objectMapper

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala
index f7051c07885..2bd6fc413fd 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala
@@ -9,8 +9,8 @@ import edu.uci.ics.amber.core.workflow.PhysicalOp.oneToOnePhysicalOp
 import edu.uci.ics.amber.core.workflow._
 import edu.uci.ics.amber.operator.map.MapOpDesc
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 class ProjectionOpDesc extends MapOpDesc {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpDesc.scala
index 8e7f0601b2c..7fa187e70ee 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpDesc.scala
@@ -5,8 +5,8 @@ import edu.uci.ics.amber.core.executor.OpExecInitInfo
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.filter.FilterOpDesc
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 import scala.util.Random

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpDesc.scala
index 45f58edafd8..6d06c839943 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpDesc.scala
@@ -7,8 +7,8 @@ import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.filter.FilterOpDesc
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 class RegexOpDesc extends FilterOpDesc {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala
index 8abc8801419..79ab7eadf1e 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala
@@ -7,8 +7,8 @@ import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.util.OperatorDescriptorUtils.equallyPartitionGoal

 import scala.util.Random
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala
index 27a988d9f37..247e7893f23 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala
@@ -8,8 +8,8 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.map.MapOpDesc
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 @JsonSchemaInject(json = """
 {
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala
index 30a604ea924..bd0bd187c23 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala
@@ -5,9 +5,9 @@ import edu.uci.ics.amber.core.storage.model.BufferedItemWriter
 import edu.uci.ics.amber.core.storage.result.ResultStorage
 import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike}
 import edu.uci.ics.amber.operator.sink.ProgressiveUtils
-import edu.uci.ics.amber.virtualidentity.WorkflowIdentity
-import edu.uci.ics.amber.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.workflow.PortIdentity
+import edu.uci.ics.amber.core.virtualidentity.WorkflowIdentity
+import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
+import edu.uci.ics.amber.core.workflow.PortIdentity

 class ProgressiveSinkOpExec(
     outputMode: OutputMode,
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala
index f41401dac12..09881901612 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala
@@ -15,7 +15,7 @@ import edu.uci.ics.amber.operator.metadata.annotations.{
   CommonOpDescAnnotation,
   HideAnnotation
 }
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}

 abstract class SklearnClassifierOpDesc extends PythonOperatorDescriptor {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala
index d464e41992d..a55cb953395 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala
@@ -6,7 +6,7 @@ import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}

 class SklearnLinearRegressionOpDesc extends PythonOperatorDescriptor {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala
index 21a2595b537..a1d4c86eb7e 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala
@@ -8,7 +8,7 @@ import edu.uci.ics.amber.operator.metadata.annotations.{
   AutofillAttributeName,
   AutofillAttributeNameOnPort1
 }
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}

 class SklearnPredictionOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(value = "Model Attribute", required = true, defaultValue = "model")
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sort/SortOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sort/SortOpDesc.scala
index 4836001fc8d..39af6cd63a9 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sort/SortOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sort/SortOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.sort

 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import edu.uci.ics.amber.core.tuple.Schema
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 class SortOpDesc extends PythonOperatorDescriptor {
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala
index b13f5bfe88d..73366908cb1 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala
@@ -9,8 +9,8 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, RangePartition}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}

 @JsonSchemaInject(json = """
 {
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala
index b42dc307ac6..bf2c9336d83 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala
@@ -5,7 +5,7 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.source.PythonSourceOperatorDescriptor
-import edu.uci.ics.amber.workflow.OutputPort
+import edu.uci.ics.amber.core.workflow.OutputPort

 class RedditSearchSourceOpDesc extends PythonSourceOperatorDescriptor {
   @JsonProperty(required = true)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/TwitterSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/TwitterSourceOpDesc.scala
index be6674e0bb3..79ddc0f3933 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/TwitterSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/TwitterSourceOpDesc.scala
@@ -4,7 +4,7 @@ import com.fasterxml.jackson.annotation.{JsonIgnore, JsonProperty}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaDescription, JsonSchemaTitle}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor
-import edu.uci.ics.amber.workflow.OutputPort
+import edu.uci.ics.amber.core.workflow.OutputPort

 abstract class TwitterSourceOpDesc extends SourceOperatorDescriptor {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala
index 6b637431c66..5b17a08d23a 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala
@@ -11,7 +11,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.metadata.annotations.UIWidget
 import edu.uci.ics.amber.operator.source.apis.twitter.TwitterSourceOpDesc
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}

 class TwitterFullArchiveSearchSourceOpDesc extends TwitterSourceOpDesc {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala
index aecea184a3c..39d5bd697bb 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala
@@ -11,7 +11,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.metadata.annotations.UIWidget
 import edu.uci.ics.amber.operator.source.apis.twitter.TwitterSourceOpDesc
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}

 class TwitterSearchSourceOpDesc extends TwitterSourceOpDesc {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpExec.scala
index ba335d1d5c9..f8dac9d938f 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/cache/CacheSourceOpExec.scala
@@ -4,7 +4,7 @@ import com.typesafe.scalalogging.LazyLogging
 import edu.uci.ics.amber.core.executor.SourceOperatorExecutor
 import edu.uci.ics.amber.core.storage.result.ResultStorage
 import edu.uci.ics.amber.core.tuple.TupleLike
-import edu.uci.ics.amber.virtualidentity.WorkflowIdentity
+import edu.uci.ics.amber.core.virtualidentity.WorkflowIdentity

 class CacheSourceOpExec(storageKey: String, workflowIdentity: WorkflowIdentity)
     extends SourceOperatorExecutor
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala
index 458a6f50c9b..38ce1e997e0 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala
@@ -7,8 +7,8 @@ import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.workflow.OutputPort
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.OutputPort

 class URLFetcherOpDesc extends SourceOperatorDescriptor {

diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala
index 15a7559af7f..5902e0e030c 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala
@@ -11,7 +11,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.metadata.annotations.HideAnnotation
 import edu.uci.ics.amber.operator.source.scan.text.TextSourceOpDesc
-import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}

 @JsonIgnoreProperties(value = Array("limit", "offset", "fileEncoding"))
 class FileScanSourceOpDesc extends ScanSourceOpDesc with TextSourceOpDesc {
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala
index 80aee74cf63..919c35f6cb4 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala
@@ -6,7 +6,7 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor
-import edu.uci.ics.amber.workflow.OutputPort
+import 
edu.uci.ics.amber.core.workflow.OutputPort import org.apache.commons.lang3.builder.EqualsBuilder import java.net.URI diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpDesc.scala index f88125ebaeb..96f48fdb75d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpDesc.scala @@ -6,7 +6,7 @@ import edu.uci.ics.amber.core.storage.DocumentFactory import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.util.ArrowUtils import java.io.IOException diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala index 26bc17e77de..6d9fc7a5d22 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala @@ -10,7 +10,7 @@ import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.inferSchemaFromRows import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import 
edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import java.io.{IOException, InputStreamReader} import java.net.URI diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala index 2337d60624a..eb50cbe0910 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala @@ -9,7 +9,7 @@ import edu.uci.ics.amber.core.storage.DocumentFactory import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.inferSchemaFromRows import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc import java.io.IOException diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala index dda17c65985..f3b2ea1f2e6 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala @@ -9,7 +9,7 @@ import edu.uci.ics.amber.core.storage.DocumentFactory import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.inferSchemaFromRows import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import 
edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc import java.io.IOException diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala index ec540537829..0be43a62c39 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala @@ -10,7 +10,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc import edu.uci.ics.amber.util.JSONUtils.{JSONToMap, objectMapper} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import java.io._ import java.net.URI diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala index e4857a28e57..e3aaec7da42 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala @@ -8,8 +8,8 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import 
edu.uci.ics.amber.operator.metadata.annotations.UIWidget import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.OutputPort +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.OutputPort class TextInputSourceOpDesc extends SourceOperatorDescriptor with TextSourceOpDesc { @JsonProperty(required = true) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala index bc1f8a86e21..8ab3249d909 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala @@ -17,8 +17,8 @@ import edu.uci.ics.amber.operator.metadata.annotations.{ AutofillAttributeNameList, UIWidget } -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.OutputPort +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.OutputPort import edu.uci.ics.amber.operator.source.sql.SQLSourceOpDesc import edu.uci.ics.amber.operator.source.sql.asterixdb.AsterixDBConnUtil.{ fetchDataTypeFields, diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpDesc.scala index b6ebd899491..073e900e658 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpDesc.scala +++ 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpDesc.scala @@ -5,8 +5,8 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.source.sql.SQLSourceOpDesc import edu.uci.ics.amber.operator.source.sql.mysql.MySQLConnUtil.connect -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.OutputPort +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.OutputPort import java.sql.{Connection, SQLException} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpDesc.scala index 0a7c10193e6..4abc00c2c6b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpDesc.scala @@ -9,8 +9,8 @@ import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo import edu.uci.ics.amber.operator.metadata.annotations.UIWidget import edu.uci.ics.amber.operator.source.sql.SQLSourceOpDesc import edu.uci.ics.amber.operator.source.sql.postgresql.PostgreSQLConnUtil.connect -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.OutputPort +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.OutputPort import java.sql.{Connection, SQLException} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala index 1c2b3b80b09..1aab480e7d5 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala @@ -7,8 +7,8 @@ import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import scala.util.Random class SplitOpDesc extends LogicalOp { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpExec.scala index 3790faf3ba5..848982a1e20 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpExec.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.split import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import scala.util.Random diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala index 8f6a8161bbf..94a2ac3b852 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp} import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} class SymmetricDifferenceOpDesc extends LogicalOp { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala index 91741b73d05..eca814f491c 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala @@ -7,8 +7,8 @@ import edu.uci.ics.amber.core.tuple.{AttributeTypeUtils, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.map.MapOpDesc import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} class TypeCastingOpDesc extends MapOpDesc { diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala index baa2974723c..a3fa40a4e01 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala @@ -13,8 +13,8 @@ import edu.uci.ics.amber.core.workflow.{ } import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.{LogicalOp, PortDescription, StateTransferFunc} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import scala.util.{Success, Try} class JavaUDFOpDesc extends LogicalOp { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala index 3f10db09099..24d6bb62549 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala @@ -8,8 +8,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc, UnknownPartition} import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, 
PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} class DualInputPortsPythonUDFOpDescV2 extends LogicalOp { @JsonProperty( diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala index 22d787cad24..aa016c2740e 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala @@ -5,7 +5,7 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{AttributeTypeUtils, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class PythonLambdaFunctionOpDesc extends PythonOperatorDescriptor { @JsonSchemaTitle("Add/Modify column(s)") diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala index 4e533940b0e..aa36eaced06 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala @@ -5,7 +5,7 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.operator.PythonOperatorDescriptor import 
edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class PythonTableReducerOpDesc extends PythonOperatorDescriptor { @JsonSchemaTitle("Output columns") diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala index e9f42ec6c51..3ce08b1510a 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala @@ -13,8 +13,8 @@ import edu.uci.ics.amber.core.workflow.{ } import edu.uci.ics.amber.operator.{LogicalOp, PortDescription, StateTransferFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import scala.util.{Success, Try} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala index 91642ec8b3e..3086d8e6762 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala @@ -7,8 +7,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, 
SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{OutputPort, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{OutputPort, PortIdentity} class PythonUDFSourceOpDescV2 extends SourceOperatorDescriptor { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala index 0489645ca6d..94f31d02f05 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala @@ -13,8 +13,8 @@ import edu.uci.ics.amber.core.workflow.{ } import edu.uci.ics.amber.operator.{LogicalOp, PortDescription, StateTransferFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import scala.util.{Success, Try} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala index 7dd22c92436..afb2e2524e4 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala @@ -7,8 +7,8 @@ import 
edu.uci.ics.amber.core.tuple.{Attribute, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{OutputPort, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{OutputPort, PortIdentity} class RUDFSourceOpDesc extends SourceOperatorDescriptor { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala index 617a0ca1264..6e6efcc1d0c 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} class UnionOpDesc extends LogicalOp { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala index abf033f4d1b..26b36b410dc 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala @@ -8,8 +8,8 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.flatmap.FlatMapOpDesc import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class UnnestStringOpDesc extends FlatMapOpDesc { @JsonProperty(value = "Delimiter", required = true, defaultValue = ",") diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala index 5cf03a51a6c..c6e6a46d34a 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala @@ -4,8 +4,8 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import 
edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala index bad227b992e..7034e9f6bb9 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala @@ -7,8 +7,8 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.visualization.hierarchychart.HierarchySection -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} // type constraint: value can only be numeric @JsonSchemaInject(json = """ diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala index cbb10911101..a1389c426d8 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala @@ -3,8 +3,8 @@ package edu.uci.ics.amber.operator.visualization.ImageViz import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import 
com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.PythonOperatorDescriptor diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala index 8e928f877bb..ea84feee197 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala @@ -3,14 +3,14 @@ package edu.uci.ics.amber.operator.visualization.ScatterMatrixChart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.{ AutofillAttributeName, AutofillAttributeNameList } import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode 
@JsonSchemaInject(json = """ { "attributeTypeRules": { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala index 64e5d6f8a60..e3a8705cab0 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} //type constraint: value can only be numeric @JsonSchemaInject(json = """ diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala index 7e48e5cd43e..064992b96d5 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala @@ -3,8 +3,8 @@ package edu.uci.ics.amber.operator.visualization.boxPlot import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} -import 
edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.PythonOperatorDescriptor diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala index 58c1a916e0e..aa589a33c24 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala @@ -4,8 +4,8 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala index 39085aa76e1..a344e3fb6d6 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class CandlestickChartOpDesc extends PythonOperatorDescriptor { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala index 189d238ee3b..e81d1d87f78 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala @@ -5,8 +5,8 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, 
OutputPort} import java.util import scala.jdk.CollectionConverters.ListHasAsScala diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala index 86854f41685..14001e6d3ec 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class ContourPlotOpDesc extends PythonOperatorDescriptor { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala index 340e9768bec..eb933ce627d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala @@ -3,8 +3,8 @@ package edu.uci.ics.amber.operator.visualization.dumbbellPlot import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import 
edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala index 862fe472b32..6c3d93b6f2e 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala @@ -5,8 +5,8 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class FigureFactoryTableOpDesc extends PythonOperatorDescriptor { @JsonProperty(required = false) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala index bb85ae82741..106f424bbc1 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala @@ -4,10 +4,10 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode class FilledAreaPlotOpDesc extends PythonOperatorDescriptor { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala index 91c75660045..c9a32bc8044 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import 
edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @JsonSchemaInject(json = """ { "attributeTypeRules": { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala index d130a8db7cf..2a34113e9fb 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala @@ -4,10 +4,10 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode @JsonSchemaInject(json = """ { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala index 8f5837affa3..8c11038d25e 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, 
AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class HeatMapOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "x", required = true) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala index 33d61e637e8..b23f7b511c2 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.PythonOperatorDescriptor // type constraint: value can only be numeric @JsonSchemaInject(json = """ diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala 
index b1470a9e23e..044dad14065 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala @@ -3,8 +3,8 @@ package edu.uci.ics.amber.operator.visualization.histogram import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.PythonOperatorDescriptor diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala index 9ff91b96bdd..5f2696c3cb9 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala @@ -8,9 +8,9 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import 
edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} /** * HTML Visualization operator to render any given HTML code diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala index 742ede861cd..e7deebe579d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala @@ -5,8 +5,8 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import java.util import scala.jdk.CollectionConverters.ListHasAsScala diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala index 133e6763781..923ca5a619a 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala @@ -4,8 +4,8 @@ import 
com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala index 8cff995fa83..054d02b8090 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala @@ -4,8 +4,8 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala index 3728bb55309..f00c164743b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class SankeyDiagramOpDesc extends PythonOperatorDescriptor { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala index aaba3fff092..d0e71870398 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala @@ -3,11 +3,11 @@ package edu.uci.ics.amber.operator.visualization.scatter3DChart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, 
Schema} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode @JsonSchemaInject(json = """ { "attributeTypeRules": { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala index 8f522388109..8195441602b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala @@ -4,10 +4,10 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode @JsonSchemaInject( json = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala index 648d4355b85..941d681a6ed 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala @@ -4,8 +4,8 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class TablesPlotOpDesc extends PythonOperatorDescriptor { @JsonPropertyDescription("List of columns to include in the table chart") diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala index 9f8059843c4..91db1d5e1b2 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import 
edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} /** * Visualization Operator for Ternary Plots. diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala index db9fa71c891..7fe381c2d28 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala @@ -6,11 +6,11 @@ import edu.uci.ics.amber.core.executor.OpExecInitInfo import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.LogicalOp -import edu.uci.ics.amber.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode /** * URL Visualization operator to render any content in given URL link diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala index 236078d6e9b..15bee2d2506 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala @@ -6,8 +6,8 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class WaterfallChartOpDesc extends PythonOperatorDescriptor { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala index 9659504c5fe..516bc3ab3b4 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala @@ -11,8 +11,8 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.visualization.ImageUtility -import edu.uci.ics.amber.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class WordCloudOpDesc extends PythonOperatorDescriptor { @JsonProperty(required = true) 
@JsonSchemaTitle("Text column") diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intersect/IntersectOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intersect/IntersectOpExecSpec.scala index f0903bdeaf0..03c10c310b6 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intersect/IntersectOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intersect/IntersectOpExecSpec.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.operator.intersect -import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, PhysicalOpIdentity} -import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{OperatorIdentity, PhysicalOpIdentity} +import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala index 501fbe8a1ab..c21d2308791 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.operator.intervalJoin -import edu.uci.ics.amber.virtualidentity.{OperatorIdentity, PhysicalOpIdentity} -import edu.uci.ics.amber.workflow.{PhysicalLink, PortIdentity} +import edu.uci.ics.amber.core.virtualidentity.{OperatorIdentity, PhysicalOpIdentity} +import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala 
b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala index 06dd412b8b9..cb9d031952b 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala @@ -4,7 +4,7 @@ import edu.uci.ics.amber.core.storage.FileResolver import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.WorkflowContext.{DEFAULT_EXECUTION_ID, DEFAULT_WORKFLOW_ID} import edu.uci.ics.amber.operator.TestOperators -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala index 48620b56717..96cfb08acde 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.operator.unneststring import edu.uci.ics.amber.core.tuple._ -import edu.uci.ics.amber.workflow.PortIdentity +import edu.uci.ics.amber.core.workflow.PortIdentity import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { From 1d3561b7c4202f45db5f7443e9e39f798fb4667d Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Mon, 30 Dec 2024 16:46:45 -0800 Subject: [PATCH 20/47] Remove duplicated scalapb definition (#3182) The scalapb proto definition is present in both workflow-core and amber.
This PR removes the second copy. --- .../src/main/protobuf/scalapb/scalapb.proto | 363 ------------------ .../src/main/protobuf/scalapb/scalapb.proto | 10 +- 2 files changed, 5 insertions(+), 368 deletions(-) delete mode 100644 core/amber/src/main/protobuf/scalapb/scalapb.proto diff --git a/core/amber/src/main/protobuf/scalapb/scalapb.proto b/core/amber/src/main/protobuf/scalapb/scalapb.proto deleted file mode 100644 index d35d373f391..00000000000 --- a/core/amber/src/main/protobuf/scalapb/scalapb.proto +++ /dev/null @@ -1,363 +0,0 @@ -syntax = "proto2"; - -package scalapb; - -option java_package = "scalapb.options"; - -option (options) = { - package_name: "scalapb.options" - flat_package: true -}; - -import "google/protobuf/descriptor.proto"; - -message ScalaPbOptions { - // If set then it overrides the java_package and package. - optional string package_name = 1; - - // If true, the compiler does not append the proto base file name - // into the generated package name. If false (the default), the - // generated scala package name is the package_name.basename where - // basename is the proto file name without the .proto extension. - optional bool flat_package = 2; - - // Adds the following imports at the top of the file (this is meant - // to provide implicit TypeMappers) - repeated string import = 3; - - // Text to add to the generated scala file. This can be used only - // when single_file is true. - repeated string preamble = 4; - - // If true, all messages and enums (but not services) will be written - // to a single Scala file. - optional bool single_file = 5; - - // By default, wrappers defined at - // https://github.com/google/protobuf/blob/master/src/google/protobuf/wrappers.proto, - // are mapped to an Option[T] where T is a primitive type. When this field - // is set to true, we do not perform this transformation. - optional bool no_primitive_wrappers = 7; - - // DEPRECATED. 
In ScalaPB <= 0.5.47, it was necessary to explicitly enable - // primitive_wrappers. This field remains here for backwards compatibility, - // but it has no effect on generated code. It is an error to set both - // `primitive_wrappers` and `no_primitive_wrappers`. - optional bool primitive_wrappers = 6; - - // Scala type to be used for repeated fields. If unspecified, - // `scala.collection.Seq` will be used. - optional string collection_type = 8; - - // If set to true, all generated messages in this file will preserve unknown - // fields. - optional bool preserve_unknown_fields = 9 [default = true]; - - // If defined, sets the name of the file-level object that would be generated. This - // object extends `GeneratedFileObject` and contains descriptors, and list of message - // and enum companions. - optional string object_name = 10; - - // Whether to apply the options only to this file, or for the entire package (and its subpackages) - enum OptionsScope { - // Apply the options for this file only (default) - FILE = 0; - - // Apply the options for the entire package and its subpackages. - PACKAGE = 1; - } - // Experimental: scope to apply the given options. - optional OptionsScope scope = 11; - - // If true, lenses will be generated. - optional bool lenses = 12 [default = true]; - - // If true, then source-code info information will be included in the - // generated code - normally the source code info is cleared out to reduce - // code size. The source code info is useful for extracting source code - // location from the descriptors as well as comments. - optional bool retain_source_code_info = 13; - - // Scala type to be used for maps. If unspecified, - // `scala.collection.immutable.Map` will be used. - optional string map_type = 14; - - // If true, no default values will be generated in message constructors. 
- optional bool no_default_values_in_constructor = 15; - - /* Naming convention for generated enum values */ - enum EnumValueNaming { - AS_IN_PROTO = 0; // Enum value names in Scala use the same name as in the proto - CAMEL_CASE = 1; // Convert enum values to CamelCase in Scala. - } - optional EnumValueNaming enum_value_naming = 16; - - // Indicate if prefix (enum name + optional underscore) should be removed in scala code - // Strip is applied before enum value naming changes. - optional bool enum_strip_prefix = 17 [default = false]; - - // Scala type to use for bytes fields. - optional string bytes_type = 21; - - // Enable java conversions for this file. - optional bool java_conversions = 23; - - // AuxMessageOptions enables you to set message-level options through package-scoped options. - // This is useful when you can't add a dependency on scalapb.proto from the proto file that - // defines the message. - message AuxMessageOptions { - // The fully-qualified name of the message in the proto name space. - optional string target = 1; - - // Options to apply to the message. If there are any options defined on the target message - // they take precedence over the options. - optional MessageOptions options = 2; - } - - // AuxFieldOptions enables you to set field-level options through package-scoped options. - // This is useful when you can't add a dependency on scalapb.proto from the proto file that - // defines the field. - message AuxFieldOptions { - // The fully-qualified name of the field in the proto name space. - optional string target = 1; - - // Options to apply to the field. If there are any options defined on the target message - // they take precedence over the options. - optional FieldOptions options = 2; - } - - // AuxEnumOptions enables you to set enum-level options through package-scoped options. - // This is useful when you can't add a dependency on scalapb.proto from the proto file that - // defines the enum. 
- message AuxEnumOptions { - // The fully-qualified name of the enum in the proto name space. - optional string target = 1; - - // Options to apply to the enum. If there are any options defined on the target enum - // they take precedence over the options. - optional EnumOptions options = 2; - } - - // AuxEnumValueOptions enables you to set enum value level options through package-scoped - // options. This is useful when you can't add a dependency on scalapb.proto from the proto - // file that defines the enum. - message AuxEnumValueOptions { - // The fully-qualified name of the enum value in the proto name space. - optional string target = 1; - - // Options to apply to the enum value. If there are any options defined on - // the target enum value they take precedence over the options. - optional EnumValueOptions options = 2; - } - - // List of message options to apply to some messages. - repeated AuxMessageOptions aux_message_options = 18; - - // List of message options to apply to some fields. - repeated AuxFieldOptions aux_field_options = 19; - - // List of message options to apply to some enums. - repeated AuxEnumOptions aux_enum_options = 20; - - // List of enum value options to apply to some enum values. - repeated AuxEnumValueOptions aux_enum_value_options = 22; - - // List of preprocessors to apply. - repeated string preprocessors = 24; - - repeated FieldTransformation field_transformations = 25; - - // Ignores all transformations for this file. This is meant to allow specific files to - // opt out from transformations inherited through package-scoped options. - optional bool ignore_all_transformations = 26; - - // If true, getters will be generated. - optional bool getters = 27 [default = true]; - - // For use in tests only. Inhibit Java conversions even when when generator parameters - // request for it. 
- optional bool test_only_no_java_conversions = 999; - - extensions 1000 to max; -} - -extend google.protobuf.FileOptions { - // File-level optionals for ScalaPB. - // Extension number officially assigned by protobuf-global-extension-registry@google.com - optional ScalaPbOptions options = 1020; -} - -message MessageOptions { - // Additional classes and traits to mix in to the case class. - repeated string extends = 1; - - // Additional classes and traits to mix in to the companion object. - repeated string companion_extends = 2; - - // Custom annotations to add to the generated case class. - repeated string annotations = 3; - - // All instances of this message will be converted to this type. An implicit TypeMapper - // must be present. - optional string type = 4; - - // Custom annotations to add to the companion object of the generated class. - repeated string companion_annotations = 5; - - // Additional classes and traits to mix in to generated sealed_oneof base trait. - repeated string sealed_oneof_extends = 6; - - // If true, when this message is used as an optional field, do not wrap it in an `Option`. - // This is equivalent of setting `(field).no_box` to true on each field with the message type. - optional bool no_box = 7; - - // Custom annotations to add to the generated `unknownFields` case class field. - repeated string unknown_fields_annotations = 8; - - extensions 1000 to max; -} - -extend google.protobuf.MessageOptions { - // Message-level optionals for ScalaPB. - // Extension number officially assigned by protobuf-global-extension-registry@google.com - optional MessageOptions message = 1020; -} - -// Represents a custom Collection type in Scala. This allows ScalaPB to integrate with -// collection types that are different enough from the ones in the standard library. 
-message Collection { - // Type of the collection - optional string type = 1; - - // Set to true if this collection type is not allowed to be empty, for example - // cats.data.NonEmptyList. When true, ScalaPB will not generate `clearX` for the repeated - // field and not provide a default argument in the constructor. - optional bool non_empty = 2; - - // An Adapter is a Scala object available at runtime that provides certain static methods - // that can operate on this collection type. - optional string adapter = 3; -} - -message FieldOptions { - optional string type = 1; - - optional string scala_name = 2; - - // Can be specified only if this field is repeated. If unspecified, - // it falls back to the file option named `collection_type`, which defaults - // to `scala.collection.Seq`. - optional string collection_type = 3; - - optional Collection collection = 8; - - // If the field is a map, you can specify custom Scala types for the key - // or value. - optional string key_type = 4; - optional string value_type = 5; - - // Custom annotations to add to the field. - repeated string annotations = 6; - - // Can be specified only if this field is a map. If unspecified, - // it falls back to the file option named `map_type` which defaults to - // `scala.collection.immutable.Map` - optional string map_type = 7; - - // Do not box this value in Option[T]. If set, this overrides MessageOptions.no_box - optional bool no_box = 30; - - // Like no_box it does not box a value in Option[T], but also fails parsing when a value - // is not provided. This enables to emulate required fields in proto3. - optional bool required = 31; - - extensions 1000 to max; -} - -extend google.protobuf.FieldOptions { - // Field-level optionals for ScalaPB. 
- // Extension number officially assigned by protobuf-global-extension-registry@google.com - optional FieldOptions field = 1020; -} - -message EnumOptions { - // Additional classes and traits to mix in to the base trait - repeated string extends = 1; - - // Additional classes and traits to mix in to the companion object. - repeated string companion_extends = 2; - - // All instances of this enum will be converted to this type. An implicit TypeMapper - // must be present. - optional string type = 3; - - // Custom annotations to add to the generated enum's base class. - repeated string base_annotations = 4; - - // Custom annotations to add to the generated trait. - repeated string recognized_annotations = 5; - - // Custom annotations to add to the generated Unrecognized case class. - repeated string unrecognized_annotations = 6; - - extensions 1000 to max; -} - -extend google.protobuf.EnumOptions { - // Enum-level optionals for ScalaPB. - // Extension number officially assigned by protobuf-global-extension-registry@google.com - // - // The field is called enum_options and not enum since enum is not allowed in Java. - optional EnumOptions enum_options = 1020; -} - -message EnumValueOptions { - // Additional classes and traits to mix in to an individual enum value. - repeated string extends = 1; - - // Name in Scala to use for this enum value. - optional string scala_name = 2; - - // Custom annotations to add to the generated case object for this enum value. - repeated string annotations = 3; - - extensions 1000 to max; -} - -extend google.protobuf.EnumValueOptions { - // Enum-level optionals for ScalaPB. - // Extension number officially assigned by protobuf-global-extension-registry@google.com - optional EnumValueOptions enum_value = 1020; -} - -message OneofOptions { - // Additional traits to mix in to a oneof. - repeated string extends = 1; - - // Name in Scala to use for this oneof field. 
- optional string scala_name = 2; - - extensions 1000 to max; -} - -extend google.protobuf.OneofOptions { - // Enum-level optionals for ScalaPB. - // Extension number officially assigned by protobuf-global-extension-registry@google.com - optional OneofOptions oneof = 1020; -} - -enum MatchType { - CONTAINS = 0; - EXACT = 1; - PRESENCE = 2; -} - -message FieldTransformation { - optional google.protobuf.FieldDescriptorProto when = 1; - optional MatchType match_type = 2 [default = CONTAINS]; - optional google.protobuf.FieldOptions set = 3; -} - -message PreprocessorOutput { - map options_by_file = 1; -} diff --git a/core/workflow-core/src/main/protobuf/scalapb/scalapb.proto b/core/workflow-core/src/main/protobuf/scalapb/scalapb.proto index bf58fe15204..d35d373f391 100644 --- a/core/workflow-core/src/main/protobuf/scalapb/scalapb.proto +++ b/core/workflow-core/src/main/protobuf/scalapb/scalapb.proto @@ -51,7 +51,7 @@ message ScalaPbOptions { // If set to true, all generated messages in this file will preserve unknown // fields. - optional bool preserve_unknown_fields = 9 [default=true]; + optional bool preserve_unknown_fields = 9 [default = true]; // If defined, sets the name of the file-level object that would be generated. This // object extends `GeneratedFileObject` and contains descriptors, and list of message @@ -70,7 +70,7 @@ message ScalaPbOptions { optional OptionsScope scope = 11; // If true, lenses will be generated. - optional bool lenses = 12 [default=true]; + optional bool lenses = 12 [default = true]; // If true, then source-code info information will be included in the // generated code - normally the source code info is cleared out to reduce @@ -94,7 +94,7 @@ message ScalaPbOptions { // Indicate if prefix (enum name + optional underscore) should be removed in scala code // Strip is applied before enum value naming changes. 
- optional bool enum_strip_prefix = 17 [default=false]; + optional bool enum_strip_prefix = 17 [default = false]; // Scala type to use for bytes fields. optional string bytes_type = 21; @@ -172,7 +172,7 @@ message ScalaPbOptions { optional bool ignore_all_transformations = 26; // If true, getters will be generated. - optional bool getters = 27 [default=true]; + optional bool getters = 27 [default = true]; // For use in tests only. Inhibit Java conversions even when when generator parameters // request for it. @@ -354,7 +354,7 @@ enum MatchType { message FieldTransformation { optional google.protobuf.FieldDescriptorProto when = 1; - optional MatchType match_type = 2 [default=CONTAINS]; + optional MatchType match_type = 2 [default = CONTAINS]; optional google.protobuf.FieldOptions set = 3; } From 4ea5fac9741eaa0dd3bf5a26e532a6d85de161d9 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Mon, 30 Dec 2024 17:28:57 -0800 Subject: [PATCH 21/47] Fix python proto gen (#3184) The Python protobuf-generated code was outdated. This PR updates the generation script to include all protobuf definitions from the workflow-core and amber sub-projects, ensuring that the latest Python code is generated and aligned with the current protobuf definitions. 
--- .../actorcommand/backpressure_handler.py | 2 +- .../architecture/managers/pause_manager.py | 2 +- .../managers/statistics_manager.py | 2 +- .../managers/tuple_processing_manager.py | 2 +- .../architecture/packaging/input_manager.py | 2 +- .../architecture/packaging/output_manager.py | 2 +- .../core/architecture/rpc/async_rpc_client.py | 6 +- .../core/architecture/rpc/async_rpc_server.py | 6 +- .../sendsemantics/broad_cast_partitioner.py | 2 +- .../hash_based_shuffle_partitioner.py | 2 +- .../sendsemantics/one_to_one_partitioner.py | 2 +- .../architecture/sendsemantics/partitioner.py | 2 +- .../range_based_shuffle_partitioner.py | 2 +- .../sendsemantics/round_robin_partitioner.py | 2 +- .../python/core/models/internal_marker.py | 2 +- .../main/python/core/models/internal_queue.py | 3 +- .../main/python/core/runnables/main_loop.py | 7 +- .../python/core/runnables/network_sender.py | 2 +- .../core/runnables/test_console_message.py | 2 +- .../python/core/runnables/test_main_loop.py | 4 +- .../core/runnables/test_network_receiver.py | 6 +- .../proto/edu/uci/ics/amber/core/__init__.py | 108 ++++++++++++ .../amber/engine/architecture/rpc/__init__.py | 55 +++--- .../architecture/sendsemantics/__init__.py | 12 +- .../engine/architecture/worker/__init__.py | 4 +- .../uci/ics/amber/engine/common/__init__.py | 94 +--------- .../src/main/python/proto/scalapb/__init__.py | 164 +++++++++--------- core/scripts/python-proto-gen.sh | 11 +- 28 files changed, 265 insertions(+), 245 deletions(-) create mode 100644 core/amber/src/main/python/proto/edu/uci/ics/amber/core/__init__.py diff --git a/core/amber/src/main/python/core/architecture/handlers/actorcommand/backpressure_handler.py b/core/amber/src/main/python/core/architecture/handlers/actorcommand/backpressure_handler.py index e1171cf8870..d96331795fe 100644 --- a/core/amber/src/main/python/core/architecture/handlers/actorcommand/backpressure_handler.py +++ 
b/core/amber/src/main/python/core/architecture/handlers/actorcommand/backpressure_handler.py @@ -13,9 +13,9 @@ from proto.edu.uci.ics.amber.engine.common import ( Backpressure, - ActorVirtualIdentity, ControlPayloadV2, ) +from proto.edu.uci.ics.amber.core import ActorVirtualIdentity class BackpressureHandler(ActorCommandHandler): diff --git a/core/amber/src/main/python/core/architecture/managers/pause_manager.py b/core/amber/src/main/python/core/architecture/managers/pause_manager.py index 4ddf2b7a85d..7ecc2631a18 100644 --- a/core/amber/src/main/python/core/architecture/managers/pause_manager.py +++ b/core/amber/src/main/python/core/architecture/managers/pause_manager.py @@ -7,7 +7,7 @@ from . import state_manager from proto.edu.uci.ics.amber.engine.architecture.worker import WorkerState -from proto.edu.uci.ics.amber.engine.common import ActorVirtualIdentity +from proto.edu.uci.ics.amber.core import ActorVirtualIdentity from ...models import InternalQueue diff --git a/core/amber/src/main/python/core/architecture/managers/statistics_manager.py b/core/amber/src/main/python/core/architecture/managers/statistics_manager.py index f615f66eb53..0dead5a347a 100644 --- a/core/amber/src/main/python/core/architecture/managers/statistics_manager.py +++ b/core/amber/src/main/python/core/architecture/managers/statistics_manager.py @@ -1,7 +1,7 @@ from typing import Dict from collections import defaultdict -from proto.edu.uci.ics.amber.engine.common import PortIdentity +from proto.edu.uci.ics.amber.core import PortIdentity from proto.edu.uci.ics.amber.engine.architecture.worker import ( WorkerStatistics, PortTupleCountMapping, diff --git a/core/amber/src/main/python/core/architecture/managers/tuple_processing_manager.py b/core/amber/src/main/python/core/architecture/managers/tuple_processing_manager.py index c217d5fe372..0b4ddb6871b 100644 --- a/core/amber/src/main/python/core/architecture/managers/tuple_processing_manager.py +++ 
b/core/amber/src/main/python/core/architecture/managers/tuple_processing_manager.py @@ -1,7 +1,7 @@ from threading import Event, Condition from typing import Optional, Tuple, Iterator -from proto.edu.uci.ics.amber.engine.common import PortIdentity +from proto.edu.uci.ics.amber.core import PortIdentity class TupleProcessingManager: diff --git a/core/amber/src/main/python/core/architecture/packaging/input_manager.py b/core/amber/src/main/python/core/architecture/packaging/input_manager.py index 4a50fccea81..1c52e797aac 100644 --- a/core/amber/src/main/python/core/architecture/packaging/input_manager.py +++ b/core/amber/src/main/python/core/architecture/packaging/input_manager.py @@ -11,7 +11,7 @@ ) from core.models.marker import EndOfInputChannel, State, StartOfInputChannel, Marker from core.models.payload import DataFrame, DataPayload, MarkerFrame -from proto.edu.uci.ics.amber.engine.common import ( +from proto.edu.uci.ics.amber.core import ( ActorVirtualIdentity, PortIdentity, ChannelIdentity, diff --git a/core/amber/src/main/python/core/architecture/packaging/output_manager.py b/core/amber/src/main/python/core/architecture/packaging/output_manager.py index e7592e0ab45..bdeac6bc367 100644 --- a/core/amber/src/main/python/core/architecture/packaging/output_manager.py +++ b/core/amber/src/main/python/core/architecture/packaging/output_manager.py @@ -32,7 +32,7 @@ RangeBasedShufflePartitioning, BroadcastPartitioning, ) -from proto.edu.uci.ics.amber.engine.common import ( +from proto.edu.uci.ics.amber.core import ( ActorVirtualIdentity, PhysicalLink, PortIdentity, diff --git a/core/amber/src/main/python/core/architecture/rpc/async_rpc_client.py b/core/amber/src/main/python/core/architecture/rpc/async_rpc_client.py index 6bd7c8a9cfb..91e4e4186e6 100644 --- a/core/amber/src/main/python/core/architecture/rpc/async_rpc_client.py +++ b/core/amber/src/main/python/core/architecture/rpc/async_rpc_client.py @@ -18,10 +18,8 @@ WorkerServiceStub, ControlRequest, ) -from 
proto.edu.uci.ics.amber.engine.common import ( - ActorVirtualIdentity, - ControlPayloadV2, -) +from proto.edu.uci.ics.amber.engine.common import ControlPayloadV2 +from proto.edu.uci.ics.amber.core import ActorVirtualIdentity R = TypeVar("R") diff --git a/core/amber/src/main/python/core/architecture/rpc/async_rpc_server.py b/core/amber/src/main/python/core/architecture/rpc/async_rpc_server.py index b214a0d5f2d..727c7dd6ac9 100644 --- a/core/amber/src/main/python/core/architecture/rpc/async_rpc_server.py +++ b/core/amber/src/main/python/core/architecture/rpc/async_rpc_server.py @@ -15,10 +15,8 @@ ControlError, ErrorLanguage, ) -from proto.edu.uci.ics.amber.engine.common import ( - ActorVirtualIdentity, - ControlPayloadV2, -) +from proto.edu.uci.ics.amber.engine.common import ControlPayloadV2 +from proto.edu.uci.ics.amber.core import ActorVirtualIdentity class AsyncRPCServer: diff --git a/core/amber/src/main/python/core/architecture/sendsemantics/broad_cast_partitioner.py b/core/amber/src/main/python/core/architecture/sendsemantics/broad_cast_partitioner.py index 407172975f0..c5ca0fad369 100644 --- a/core/amber/src/main/python/core/architecture/sendsemantics/broad_cast_partitioner.py +++ b/core/amber/src/main/python/core/architecture/sendsemantics/broad_cast_partitioner.py @@ -11,7 +11,7 @@ Partitioning, BroadcastPartitioning, ) -from proto.edu.uci.ics.amber.engine.common import ActorVirtualIdentity +from proto.edu.uci.ics.amber.core import ActorVirtualIdentity class BroadcastPartitioner(Partitioner): diff --git a/core/amber/src/main/python/core/architecture/sendsemantics/hash_based_shuffle_partitioner.py b/core/amber/src/main/python/core/architecture/sendsemantics/hash_based_shuffle_partitioner.py index f4e0942768c..775bd94b028 100644 --- a/core/amber/src/main/python/core/architecture/sendsemantics/hash_based_shuffle_partitioner.py +++ b/core/amber/src/main/python/core/architecture/sendsemantics/hash_based_shuffle_partitioner.py @@ -11,7 +11,7 @@ 
HashBasedShufflePartitioning, Partitioning, ) -from proto.edu.uci.ics.amber.engine.common import ActorVirtualIdentity +from proto.edu.uci.ics.amber.core import ActorVirtualIdentity class HashBasedShufflePartitioner(Partitioner): diff --git a/core/amber/src/main/python/core/architecture/sendsemantics/one_to_one_partitioner.py b/core/amber/src/main/python/core/architecture/sendsemantics/one_to_one_partitioner.py index 1758363c0cb..81e623ab6a0 100644 --- a/core/amber/src/main/python/core/architecture/sendsemantics/one_to_one_partitioner.py +++ b/core/amber/src/main/python/core/architecture/sendsemantics/one_to_one_partitioner.py @@ -10,7 +10,7 @@ OneToOnePartitioning, Partitioning, ) -from proto.edu.uci.ics.amber.engine.common import ActorVirtualIdentity +from proto.edu.uci.ics.amber.core import ActorVirtualIdentity class OneToOnePartitioner(Partitioner): diff --git a/core/amber/src/main/python/core/architecture/sendsemantics/partitioner.py b/core/amber/src/main/python/core/architecture/sendsemantics/partitioner.py index e2ff2df34c7..2220b350abc 100644 --- a/core/amber/src/main/python/core/architecture/sendsemantics/partitioner.py +++ b/core/amber/src/main/python/core/architecture/sendsemantics/partitioner.py @@ -8,7 +8,7 @@ from core.models.marker import Marker from core.util import get_one_of from proto.edu.uci.ics.amber.engine.architecture.sendsemantics import Partitioning -from proto.edu.uci.ics.amber.engine.common import ActorVirtualIdentity +from proto.edu.uci.ics.amber.core import ActorVirtualIdentity class Partitioner(ABC): diff --git a/core/amber/src/main/python/core/architecture/sendsemantics/range_based_shuffle_partitioner.py b/core/amber/src/main/python/core/architecture/sendsemantics/range_based_shuffle_partitioner.py index 31d0ccc6f87..dee786c649a 100644 --- a/core/amber/src/main/python/core/architecture/sendsemantics/range_based_shuffle_partitioner.py +++ b/core/amber/src/main/python/core/architecture/sendsemantics/range_based_shuffle_partitioner.py @@ 
-12,7 +12,7 @@
     RangeBasedShufflePartitioning,
     Partitioning,
 )
-from proto.edu.uci.ics.amber.engine.common import ActorVirtualIdentity
+from proto.edu.uci.ics.amber.core import ActorVirtualIdentity
 
 
 class RangeBasedShufflePartitioner(Partitioner):
diff --git a/core/amber/src/main/python/core/architecture/sendsemantics/round_robin_partitioner.py b/core/amber/src/main/python/core/architecture/sendsemantics/round_robin_partitioner.py
index 47011051f4e..4baa1463193 100644
--- a/core/amber/src/main/python/core/architecture/sendsemantics/round_robin_partitioner.py
+++ b/core/amber/src/main/python/core/architecture/sendsemantics/round_robin_partitioner.py
@@ -11,7 +11,7 @@
     Partitioning,
     RoundRobinPartitioning,
 )
-from proto.edu.uci.ics.amber.engine.common import ActorVirtualIdentity
+from proto.edu.uci.ics.amber.core import ActorVirtualIdentity
 
 
 class RoundRobinPartitioner(Partitioner):
diff --git a/core/amber/src/main/python/core/models/internal_marker.py b/core/amber/src/main/python/core/models/internal_marker.py
index 78ed5c60513..1f21f731d79 100644
--- a/core/amber/src/main/python/core/models/internal_marker.py
+++ b/core/amber/src/main/python/core/models/internal_marker.py
@@ -1,6 +1,6 @@
 from dataclasses import dataclass
 from core.models.marker import Marker
-from proto.edu.uci.ics.amber.engine.common import ChannelIdentity
+from proto.edu.uci.ics.amber.core import ChannelIdentity
 
 
 @dataclass
diff --git a/core/amber/src/main/python/core/models/internal_queue.py b/core/amber/src/main/python/core/models/internal_queue.py
index 36e271983e8..ae22ba134e9 100644
--- a/core/amber/src/main/python/core/models/internal_queue.py
+++ b/core/amber/src/main/python/core/models/internal_queue.py
@@ -11,7 +11,8 @@
     LinkedBlockingMultiQueue,
 )
 from core.util.customized_queue.queue_base import IQueue, QueueElement
-from proto.edu.uci.ics.amber.engine.common import ActorVirtualIdentity, ControlPayloadV2
+from proto.edu.uci.ics.amber.core import ActorVirtualIdentity
+from proto.edu.uci.ics.amber.engine.common import ControlPayloadV2
 
 
 @dataclass
diff --git a/core/amber/src/main/python/core/runnables/main_loop.py b/core/amber/src/main/python/core/runnables/main_loop.py
index 845afe0d21a..0b66450162d 100644
--- a/core/amber/src/main/python/core/runnables/main_loop.py
+++ b/core/amber/src/main/python/core/runnables/main_loop.py
@@ -39,11 +39,8 @@
 from proto.edu.uci.ics.amber.engine.architecture.worker import (
     WorkerState,
 )
-from proto.edu.uci.ics.amber.engine.common import (
-    ActorVirtualIdentity,
-    ControlPayloadV2,
-    PortIdentity,
-)
+from proto.edu.uci.ics.amber.engine.common import ControlPayloadV2
+from proto.edu.uci.ics.amber.core import ActorVirtualIdentity, PortIdentity
 
 
 class MainLoop(StoppableQueueBlockingRunnable):
diff --git a/core/amber/src/main/python/core/runnables/network_sender.py b/core/amber/src/main/python/core/runnables/network_sender.py
index 031f2783902..5ce2c7c7f95 100644
--- a/core/amber/src/main/python/core/runnables/network_sender.py
+++ b/core/amber/src/main/python/core/runnables/network_sender.py
@@ -9,11 +9,11 @@
 from core.proxy import ProxyClient
 from core.util import StoppableQueueBlockingRunnable
 from proto.edu.uci.ics.amber.engine.common import (
-    ActorVirtualIdentity,
     ControlPayloadV2,
     PythonControlMessage,
     PythonDataHeader,
 )
+from proto.edu.uci.ics.amber.core import ActorVirtualIdentity
 
 
 class NetworkSender(StoppableQueueBlockingRunnable):
diff --git a/core/amber/src/main/python/core/runnables/test_console_message.py b/core/amber/src/main/python/core/runnables/test_console_message.py
index a643a789855..2ff4373f7d7 100644
--- a/core/amber/src/main/python/core/runnables/test_console_message.py
+++ b/core/amber/src/main/python/core/runnables/test_console_message.py
@@ -10,10 +10,10 @@
     ConsoleMessageType,
 )
 from proto.edu.uci.ics.amber.engine.common import (
-    ActorVirtualIdentity,
     ControlPayloadV2,
     PythonControlMessage,
 )
+from proto.edu.uci.ics.amber.core import ActorVirtualIdentity
 
 
 class TestConsoleMessage:
diff --git a/core/amber/src/main/python/core/runnables/test_main_loop.py b/core/amber/src/main/python/core/runnables/test_main_loop.py
index 05cfdf9362b..77981fade9b 100644
--- a/core/amber/src/main/python/core/runnables/test_main_loop.py
+++ b/core/amber/src/main/python/core/runnables/test_main_loop.py
@@ -42,15 +42,15 @@
     WorkerStatistics,
     PortTupleCountMapping,
 )
-from proto.edu.uci.ics.amber.engine.common import (
+from proto.edu.uci.ics.amber.core import (
     ActorVirtualIdentity,
-    ControlPayloadV2,
     PhysicalLink,
     PhysicalOpIdentity,
     OperatorIdentity,
     ChannelIdentity,
     PortIdentity,
 )
+from proto.edu.uci.ics.amber.engine.common import ControlPayloadV2
 from pytexera.udf.examples.count_batch_operator import CountBatchOperator
 from pytexera.udf.examples.echo_operator import EchoOperator
 from google.protobuf.any_pb2 import Any as ProtoAny
diff --git a/core/amber/src/main/python/core/runnables/test_network_receiver.py b/core/amber/src/main/python/core/runnables/test_network_receiver.py
index f2ca1d640c8..cfba03c7fee 100644
--- a/core/amber/src/main/python/core/runnables/test_network_receiver.py
+++ b/core/amber/src/main/python/core/runnables/test_network_receiver.py
@@ -11,10 +11,8 @@
 from core.runnables.network_sender import NetworkSender
 from core.util.proto import set_one_of
 from proto.edu.uci.ics.amber.engine.architecture.rpc import ControlInvocation
-from proto.edu.uci.ics.amber.engine.common import (
-    ActorVirtualIdentity,
-    ControlPayloadV2,
-)
+from proto.edu.uci.ics.amber.engine.common import ControlPayloadV2
+from proto.edu.uci.ics.amber.core import ActorVirtualIdentity
 
 
 class TestNetworkReceiver:
diff --git a/core/amber/src/main/python/proto/edu/uci/ics/amber/core/__init__.py b/core/amber/src/main/python/proto/edu/uci/ics/amber/core/__init__.py
new file mode 100644
index 00000000000..0cb9940da1d
--- /dev/null
+++ b/core/amber/src/main/python/proto/edu/uci/ics/amber/core/__init__.py
@@ -0,0 +1,108 @@
+# Generated by the protocol buffer compiler. DO NOT EDIT!
+# sources: edu/uci/ics/amber/core/virtualidentity.proto, edu/uci/ics/amber/core/workflow.proto, edu/uci/ics/amber/core/workflowruntimestate.proto
+# plugin: python-betterproto
+# This file has been @generated
+
+from dataclasses import dataclass
+from datetime import datetime
+from typing import List
+
+import betterproto
+
+
+class OutputPortOutputMode(betterproto.Enum):
+    SET_SNAPSHOT = 0
+    """outputs complete result set snapshot for each update"""
+
+    SET_DELTA = 1
+    """outputs incremental result set delta for each update"""
+
+    SINGLE_SNAPSHOT = 2
+    """
+    outputs a single snapshot for the entire execution,
+    used explicitly to support visualization operators that may exceed the memory limit
+    TODO: remove this mode after we have a better solution for output size limit
+    """
+
+
+class FatalErrorType(betterproto.Enum):
+    COMPILATION_ERROR = 0
+    EXECUTION_FAILURE = 1
+
+
+@dataclass(eq=False, repr=False)
+class WorkflowIdentity(betterproto.Message):
+    id: int = betterproto.int64_field(1)
+
+
+@dataclass(eq=False, repr=False)
+class ExecutionIdentity(betterproto.Message):
+    id: int = betterproto.int64_field(1)
+
+
+@dataclass(eq=False, repr=False)
+class ActorVirtualIdentity(betterproto.Message):
+    name: str = betterproto.string_field(1)
+
+
+@dataclass(eq=False, repr=False)
+class ChannelIdentity(betterproto.Message):
+    from_worker_id: "ActorVirtualIdentity" = betterproto.message_field(1)
+    to_worker_id: "ActorVirtualIdentity" = betterproto.message_field(2)
+    is_control: bool = betterproto.bool_field(3)
+
+
+@dataclass(eq=False, repr=False)
+class OperatorIdentity(betterproto.Message):
+    id: str = betterproto.string_field(1)
+
+
+@dataclass(eq=False, repr=False)
+class PhysicalOpIdentity(betterproto.Message):
+    logical_op_id: "OperatorIdentity" = betterproto.message_field(1)
+    layer_name: str = betterproto.string_field(2)
+
+
+@dataclass(eq=False, repr=False)
+class ChannelMarkerIdentity(betterproto.Message):
+    id: str = betterproto.string_field(1)
+
+
+@dataclass(eq=False, repr=False)
+class PortIdentity(betterproto.Message):
+    id: int = betterproto.int32_field(1)
+    internal: bool = betterproto.bool_field(2)
+
+
+@dataclass(eq=False, repr=False)
+class InputPort(betterproto.Message):
+    id: "PortIdentity" = betterproto.message_field(1)
+    display_name: str = betterproto.string_field(2)
+    allow_multi_links: bool = betterproto.bool_field(3)
+    dependencies: List["PortIdentity"] = betterproto.message_field(4)
+
+
+@dataclass(eq=False, repr=False)
+class OutputPort(betterproto.Message):
+    id: "PortIdentity" = betterproto.message_field(1)
+    display_name: str = betterproto.string_field(2)
+    blocking: bool = betterproto.bool_field(3)
+    mode: "OutputPortOutputMode" = betterproto.enum_field(4)
+
+
+@dataclass(eq=False, repr=False)
+class PhysicalLink(betterproto.Message):
+    from_op_id: "PhysicalOpIdentity" = betterproto.message_field(1)
+    from_port_id: "PortIdentity" = betterproto.message_field(2)
+    to_op_id: "PhysicalOpIdentity" = betterproto.message_field(3)
+    to_port_id: "PortIdentity" = betterproto.message_field(4)
+
+
+@dataclass(eq=False, repr=False)
+class WorkflowFatalError(betterproto.Message):
+    type: "FatalErrorType" = betterproto.enum_field(1)
+    timestamp: datetime = betterproto.message_field(2)
+    message: str = betterproto.string_field(3)
+    details: str = betterproto.string_field(4)
+    operator_id: str = betterproto.string_field(5)
+    worker_id: str = betterproto.string_field(6)
diff --git a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/rpc/__init__.py b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/rpc/__init__.py
index e2e066fff2e..676292d9605 100644
--- a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/rpc/__init__.py
+++ b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/rpc/__init__.py
@@ -17,7 +17,7 @@
 import grpclib
 from betterproto.grpc.grpclib_server import ServiceBase
 
-from ... import common as __common__
+from .... import core as ___core__
 from .. import (
     sendsemantics as _sendsemantics__,
     worker as _worker__,
@@ -146,8 +146,8 @@ class EmptyRequest(betterproto.Message):
 
 @dataclass(eq=False, repr=False)
 class AsyncRpcContext(betterproto.Message):
-    sender: "__common__.ActorVirtualIdentity" = betterproto.message_field(1)
-    receiver: "__common__.ActorVirtualIdentity" = betterproto.message_field(2)
+    sender: "___core__.ActorVirtualIdentity" = betterproto.message_field(1)
+    receiver: "___core__.ActorVirtualIdentity" = betterproto.message_field(2)
 
 
 @dataclass(eq=False, repr=False)
@@ -162,9 +162,9 @@
 class ChannelMarkerPayload(betterproto.Message):
     """Message for ChannelMarkerPayload"""
 
-    id: "__common__.ChannelMarkerIdentity" = betterproto.message_field(1)
+    id: "___core__.ChannelMarkerIdentity" = betterproto.message_field(1)
     marker_type: "ChannelMarkerType" = betterproto.enum_field(2)
-    scope: List["__common__.ChannelIdentity"] = betterproto.message_field(3)
+    scope: List["___core__.ChannelIdentity"] = betterproto.message_field(3)
     command_mapping: Dict[str, "ControlInvocation"] = betterproto.map_field(
         4, betterproto.TYPE_STRING, betterproto.TYPE_MESSAGE
     )
@@ -172,13 +172,13 @@
 
 
 @dataclass(eq=False, repr=False)
 class PropagateChannelMarkerRequest(betterproto.Message):
-    source_op_to_start_prop: List["__common__.PhysicalOpIdentity"] = (
+    source_op_to_start_prop: List["___core__.PhysicalOpIdentity"] = (
         betterproto.message_field(1)
     )
-    id: "__common__.ChannelMarkerIdentity" = betterproto.message_field(2)
+    id: "___core__.ChannelMarkerIdentity" = betterproto.message_field(2)
     marker_type: "ChannelMarkerType" = betterproto.enum_field(3)
-    scope: List["__common__.PhysicalOpIdentity"] = betterproto.message_field(4)
-    target_ops: List["__common__.PhysicalOpIdentity"] = betterproto.message_field(5)
+    scope: List["___core__.PhysicalOpIdentity"] = betterproto.message_field(4)
+    target_ops: List["___core__.PhysicalOpIdentity"] = betterproto.message_field(5)
     marker_command: "ControlRequest" = betterproto.message_field(6)
     marker_method_name: str = betterproto.string_field(7)
@@ -186,7 +186,7 @@
 @dataclass(eq=False, repr=False)
 class TakeGlobalCheckpointRequest(betterproto.Message):
     estimation_only: bool = betterproto.bool_field(1)
-    checkpoint_id: "__common__.ChannelMarkerIdentity" = betterproto.message_field(2)
+    checkpoint_id: "___core__.ChannelMarkerIdentity" = betterproto.message_field(2)
     destination: str = betterproto.string_field(3)
 
 
@@ -215,7 +215,7 @@ class ModifyLogicRequest(betterproto.Message):
 
 @dataclass(eq=False, repr=False)
 class RetryWorkflowRequest(betterproto.Message):
-    workers: List["__common__.ActorVirtualIdentity"] = betterproto.message_field(1)
+    workers: List["___core__.ActorVirtualIdentity"] = betterproto.message_field(1)
 
 
 @dataclass(eq=False, repr=False)
@@ -235,7 +235,7 @@ class ConsoleMessageTriggeredRequest(betterproto.Message):
 
 @dataclass(eq=False, repr=False)
 class PortCompletedRequest(betterproto.Message):
-    port_id: "__common__.PortIdentity" = betterproto.message_field(1)
+    port_id: "___core__.PortIdentity" = betterproto.message_field(1)
     input: bool = betterproto.bool_field(2)
 
 
@@ -246,7 +246,7 @@ class WorkerStateUpdatedRequest(betterproto.Message):
 
 @dataclass(eq=False, repr=False)
 class LinkWorkersRequest(betterproto.Message):
-    link: "__common__.PhysicalLink" = betterproto.message_field(1)
+    link: "___core__.PhysicalLink" = betterproto.message_field(1)
 
 
 @dataclass(eq=False, repr=False)
@@ -255,7 +255,7 @@ class Ping(betterproto.Message):
 
     i: int = betterproto.int32_field(1)
     end: int = betterproto.int32_field(2)
-    to: "__common__.ActorVirtualIdentity" = betterproto.message_field(3)
+    to: "___core__.ActorVirtualIdentity" = betterproto.message_field(3)
 
 
 @dataclass(eq=False, repr=False)
@@ -264,7 +264,7 @@ class Pong(betterproto.Message):
 
     i: int = betterproto.int32_field(1)
     end: int = betterproto.int32_field(2)
-    to: "__common__.ActorVirtualIdentity" = betterproto.message_field(3)
+    to: "___core__.ActorVirtualIdentity" = betterproto.message_field(3)
 
 
 @dataclass(eq=False, repr=False)
@@ -285,7 +285,7 @@ class Nested(betterproto.Message):
 class MultiCall(betterproto.Message):
     """MultiCall message"""
 
-    seq: List["__common__.ActorVirtualIdentity"] = betterproto.message_field(1)
+    seq: List["___core__.ActorVirtualIdentity"] = betterproto.message_field(1)
 
 
 @dataclass(eq=False, repr=False)
@@ -299,7 +299,7 @@ class ErrorCommand(betterproto.Message):
 class Collect(betterproto.Message):
     """Collect message"""
 
-    workers: List["__common__.ActorVirtualIdentity"] = betterproto.message_field(1)
+    workers: List["___core__.ActorVirtualIdentity"] = betterproto.message_field(1)
 
 
 @dataclass(eq=False, repr=False)
@@ -313,7 +313,7 @@ class GenerateNumber(betterproto.Message):
 class Chain(betterproto.Message):
     """Chain message"""
 
-    nexts: List["__common__.ActorVirtualIdentity"] = betterproto.message_field(1)
+    nexts: List["___core__.ActorVirtualIdentity"] = betterproto.message_field(1)
 
 
 @dataclass(eq=False, repr=False)
@@ -327,19 +327,19 @@
 class AddInputChannelRequest(betterproto.Message):
     """Messages for the commands"""
 
-    channel_id: "__common__.ChannelIdentity" = betterproto.message_field(1)
-    port_id: "__common__.PortIdentity" = betterproto.message_field(2)
+    channel_id: "___core__.ChannelIdentity" = betterproto.message_field(1)
+    port_id: "___core__.PortIdentity" = betterproto.message_field(2)
 
 
 @dataclass(eq=False, repr=False)
 class AddPartitioningRequest(betterproto.Message):
-    tag: "__common__.PhysicalLink" = betterproto.message_field(1)
+    tag: "___core__.PhysicalLink" = betterproto.message_field(1)
     partitioning: "_sendsemantics__.Partitioning" = betterproto.message_field(2)
 
 
 @dataclass(eq=False, repr=False)
 class AssignPortRequest(betterproto.Message):
-    port_id: "__common__.PortIdentity" = betterproto.message_field(1)
+    port_id: "___core__.PortIdentity" = betterproto.message_field(1)
     input: bool = betterproto.bool_field(2)
     schema: Dict[str, str] = betterproto.map_field(
         3, betterproto.TYPE_STRING, betterproto.TYPE_STRING
@@ -348,7 +348,7 @@ class AssignPortRequest(betterproto.Message):
 
 @dataclass(eq=False, repr=False)
 class FinalizeCheckpointRequest(betterproto.Message):
-    checkpoint_id: "__common__.ChannelMarkerIdentity" = betterproto.message_field(1)
+    checkpoint_id: "___core__.ChannelMarkerIdentity" = betterproto.message_field(1)
     write_to: str = betterproto.string_field(2)
 
 
@@ -364,7 +364,7 @@ class InitializeExecutorRequest(betterproto.Message):
 
 @dataclass(eq=False, repr=False)
 class UpdateExecutorRequest(betterproto.Message):
-    target_op_id: "__common__.PhysicalOpIdentity" = betterproto.message_field(1)
+    target_op_id: "___core__.PhysicalOpIdentity" = betterproto.message_field(1)
     new_executor: "betterproto_lib_google_protobuf.Any" = betterproto.message_field(2)
     state_transfer_func: "betterproto_lib_google_protobuf.Any" = (
         betterproto.message_field(3)
@@ -373,13 +373,13 @@ class UpdateExecutorRequest(betterproto.Message):
 
 @dataclass(eq=False, repr=False)
 class PrepareCheckpointRequest(betterproto.Message):
-    checkpoint_id: "__common__.ChannelMarkerIdentity" = betterproto.message_field(1)
+    checkpoint_id: "___core__.ChannelMarkerIdentity" = betterproto.message_field(1)
     estimation_only: bool = betterproto.bool_field(2)
 
 
 @dataclass(eq=False, repr=False)
 class QueryStatisticsRequest(betterproto.Message):
-    filter_by_workers: List["__common__.ActorVirtualIdentity"] = (
+    filter_by_workers: List["___core__.ActorVirtualIdentity"] = (
         betterproto.message_field(1)
     )
 
@@ -1235,6 +1235,7 @@ async def retry_workflow(
 
 
 class RpcTesterBase(ServiceBase):
+
     async def send_ping(self, ping: "Ping") -> "IntResponse":
         raise grpclib.GRPCError(grpclib.const.Status.UNIMPLEMENTED)
 
@@ -1405,6 +1406,7 @@ def __mapping__(self) -> Dict[str, grpclib.const.Handler]:
 
 
 class WorkerServiceBase(ServiceBase):
+
     async def add_input_channel(
         self, add_input_channel_request: "AddInputChannelRequest"
     ) -> "EmptyReturn":
@@ -1711,6 +1713,7 @@ def __mapping__(self) -> Dict[str, grpclib.const.Handler]:
 
 
 class ControllerServiceBase(ServiceBase):
+
    async def retrieve_workflow_state(
         self, empty_request: "EmptyRequest"
     ) -> "RetrieveWorkflowStateResponse":
diff --git a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/sendsemantics/__init__.py b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/sendsemantics/__init__.py
index b862b7ea3c1..b9769dc2bb9 100644
--- a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/sendsemantics/__init__.py
+++ b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/sendsemantics/__init__.py
@@ -8,7 +8,7 @@
 
 import betterproto
 
-from ... import common as __common__
+from .... import core as ___core__
 
 
 @dataclass(eq=False, repr=False)
@@ -33,26 +33,26 @@ class Partitioning(betterproto.Message):
 @dataclass(eq=False, repr=False)
 class OneToOnePartitioning(betterproto.Message):
     batch_size: int = betterproto.int32_field(1)
-    channels: List["__common__.ChannelIdentity"] = betterproto.message_field(2)
+    channels: List["___core__.ChannelIdentity"] = betterproto.message_field(2)
 
 
 @dataclass(eq=False, repr=False)
 class RoundRobinPartitioning(betterproto.Message):
     batch_size: int = betterproto.int32_field(1)
-    channels: List["__common__.ChannelIdentity"] = betterproto.message_field(2)
+    channels: List["___core__.ChannelIdentity"] = betterproto.message_field(2)
 
 
 @dataclass(eq=False, repr=False)
 class HashBasedShufflePartitioning(betterproto.Message):
     batch_size: int = betterproto.int32_field(1)
-    channels: List["__common__.ChannelIdentity"] = betterproto.message_field(2)
+    channels: List["___core__.ChannelIdentity"] = betterproto.message_field(2)
     hash_attribute_names: List[str] = betterproto.string_field(3)
 
 
 @dataclass(eq=False, repr=False)
 class RangeBasedShufflePartitioning(betterproto.Message):
     batch_size: int = betterproto.int32_field(1)
-    channels: List["__common__.ChannelIdentity"] = betterproto.message_field(2)
+    channels: List["___core__.ChannelIdentity"] = betterproto.message_field(2)
     range_attribute_names: List[str] = betterproto.string_field(3)
     range_min: int = betterproto.int64_field(4)
     range_max: int = betterproto.int64_field(5)
@@ -61,4 +61,4 @@ class RangeBasedShufflePartitioning(betterproto.Message):
 @dataclass(eq=False, repr=False)
 class BroadcastPartitioning(betterproto.Message):
     batch_size: int = betterproto.int32_field(1)
-    channels: List["__common__.ChannelIdentity"] = betterproto.message_field(2)
+    channels: List["___core__.ChannelIdentity"] = betterproto.message_field(2)
diff --git a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/worker/__init__.py b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/worker/__init__.py
index 4f7c35a6e96..344972b1060 100644
--- a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/worker/__init__.py
+++ b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/worker/__init__.py
@@ -8,7 +8,7 @@
 
 import betterproto
 
-from ... import common as __common__
+from .... import core as ___core__
 
 
 class WorkerState(betterproto.Enum):
@@ -21,7 +21,7 @@ class WorkerState(betterproto.Enum):
 
 @dataclass(eq=False, repr=False)
 class PortTupleCountMapping(betterproto.Message):
-    port_id: "__common__.PortIdentity" = betterproto.message_field(1)
+    port_id: "___core__.PortIdentity" = betterproto.message_field(1)
     tuple_count: int = betterproto.int64_field(2)
 
 
diff --git a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/common/__init__.py b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/common/__init__.py
index 1c38e3cc6cf..7d1e19c8f8e 100644
--- a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/common/__init__.py
+++ b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/common/__init__.py
@@ -1,10 +1,9 @@
 # Generated by the protocol buffer compiler. DO NOT EDIT!
-# sources: edu/uci/ics/amber/engine/common/actormessage.proto, edu/uci/ics/amber/engine/common/ambermessage.proto, edu/uci/ics/amber/engine/common/virtualidentity.proto, edu/uci/ics/amber/engine/common/workflow.proto, edu/uci/ics/amber/engine/common/workflowruntimestate.proto
+# sources: edu/uci/ics/amber/engine/common/actormessage.proto, edu/uci/ics/amber/engine/common/ambermessage.proto, edu/uci/ics/amber/engine/common/executionruntimestate.proto
 # plugin: python-betterproto
 # This file has been @generated
 
 from dataclasses import dataclass
-from datetime import datetime
 from typing import (
     Dict,
     List,
@@ -12,84 +11,13 @@
 
 import betterproto
 
+from ... import core as __core__
 from ..architecture import (
     rpc as _architecture_rpc__,
     worker as _architecture_worker__,
 )
 
 
-class FatalErrorType(betterproto.Enum):
-    COMPILATION_ERROR = 0
-    EXECUTION_FAILURE = 1
-
-
-@dataclass(eq=False, repr=False)
-class WorkflowIdentity(betterproto.Message):
-    id: int = betterproto.int64_field(1)
-
-
-@dataclass(eq=False, repr=False)
-class ExecutionIdentity(betterproto.Message):
-    id: int = betterproto.int64_field(1)
-
-
-@dataclass(eq=False, repr=False)
-class ActorVirtualIdentity(betterproto.Message):
-    name: str = betterproto.string_field(1)
-
-
-@dataclass(eq=False, repr=False)
-class ChannelIdentity(betterproto.Message):
-    from_worker_id: "ActorVirtualIdentity" = betterproto.message_field(1)
-    to_worker_id: "ActorVirtualIdentity" = betterproto.message_field(2)
-    is_control: bool = betterproto.bool_field(3)
-
-
-@dataclass(eq=False, repr=False)
-class OperatorIdentity(betterproto.Message):
-    id: str = betterproto.string_field(1)
-
-
-@dataclass(eq=False, repr=False)
-class PhysicalOpIdentity(betterproto.Message):
-    logical_op_id: "OperatorIdentity" = betterproto.message_field(1)
-    layer_name: str = betterproto.string_field(2)
-
-
-@dataclass(eq=False, repr=False)
-class ChannelMarkerIdentity(betterproto.Message):
-    id: str = betterproto.string_field(1)
-
-
-@dataclass(eq=False, repr=False)
-class PortIdentity(betterproto.Message):
-    id: int = betterproto.int32_field(1)
-    internal: bool = betterproto.bool_field(2)
-
-
-@dataclass(eq=False, repr=False)
-class InputPort(betterproto.Message):
-    id: "PortIdentity" = betterproto.message_field(1)
-    display_name: str = betterproto.string_field(2)
-    allow_multi_links: bool = betterproto.bool_field(3)
-    dependencies: List["PortIdentity"] = betterproto.message_field(4)
-
-
-@dataclass(eq=False, repr=False)
-class OutputPort(betterproto.Message):
-    id: "PortIdentity" = betterproto.message_field(1)
-    display_name: str = betterproto.string_field(2)
-    blocking: bool = betterproto.bool_field(3)
-
-
-@dataclass(eq=False, repr=False)
-class PhysicalLink(betterproto.Message):
-    from_op_id: "PhysicalOpIdentity" = betterproto.message_field(1)
-    from_port_id: "PortIdentity" = betterproto.message_field(2)
-    to_op_id: "PhysicalOpIdentity" = betterproto.message_field(3)
-    to_port_id: "PortIdentity" = betterproto.message_field(4)
-
-
 @dataclass(eq=False, repr=False)
 class ControlPayloadV2(betterproto.Message):
     control_invocation: "_architecture_rpc__.ControlInvocation" = (
@@ -102,13 +30,13 @@
 
 @dataclass(eq=False, repr=False)
 class PythonDataHeader(betterproto.Message):
-    tag: "ActorVirtualIdentity" = betterproto.message_field(1)
+    tag: "__core__.ActorVirtualIdentity" = betterproto.message_field(1)
     payload_type: str = betterproto.string_field(2)
 
 
 @dataclass(eq=False, repr=False)
 class PythonControlMessage(betterproto.Message):
-    tag: "ActorVirtualIdentity" = betterproto.message_field(1)
+    tag: "__core__.ActorVirtualIdentity" = betterproto.message_field(1)
     payload: "ControlPayloadV2" = betterproto.message_field(2)
 
 
@@ -199,21 +127,11 @@ class ExecutionStatsStore(betterproto.Message):
     )
 
 
-@dataclass(eq=False, repr=False)
-class WorkflowFatalError(betterproto.Message):
-    type: "FatalErrorType" = betterproto.enum_field(1)
-    timestamp: datetime = betterproto.message_field(2)
-    message: str = betterproto.string_field(3)
-    details: str = betterproto.string_field(4)
-    operator_id: str = betterproto.string_field(5)
-    worker_id: str = betterproto.string_field(6)
-
-
 @dataclass(eq=False, repr=False)
 class ExecutionMetadataStore(betterproto.Message):
     state: "_architecture_rpc__.WorkflowAggregatedState" = betterproto.enum_field(1)
-    fatal_errors: List["WorkflowFatalError"] = betterproto.message_field(2)
-    execution_id: "ExecutionIdentity" = betterproto.message_field(3)
+    fatal_errors: List["__core__.WorkflowFatalError"] = betterproto.message_field(2)
+    execution_id: "__core__.ExecutionIdentity" = betterproto.message_field(3)
     is_recovering: bool = betterproto.bool_field(4)
 
 
diff --git a/core/amber/src/main/python/proto/scalapb/__init__.py b/core/amber/src/main/python/proto/scalapb/__init__.py
index 51a1655804e..49c713815a5 100644
--- a/core/amber/src/main/python/proto/scalapb/__init__.py
+++ b/core/amber/src/main/python/proto/scalapb/__init__.py
@@ -21,8 +21,7 @@ class MatchType(betterproto.Enum):
 
 class ScalaPbOptionsOptionsScope(betterproto.Enum):
     """
-    Whether to apply the options only to this file, or for the entire package
-    (and its subpackages)
+    Whether to apply the options only to this file, or for the entire package (and its subpackages)
     """
 
     FILE = 0
@@ -46,63 +45,63 @@ class ScalaPbOptions(betterproto.Message):
 
     flat_package: bool = betterproto.bool_field(2)
     """
-    If true, the compiler does not append the proto base file name into the
-    generated package name. If false (the default), the generated scala package
-    name is the package_name.basename where basename is the proto file name
-    without the .proto extension.
+    If true, the compiler does not append the proto base file name
+    into the generated package name. If false (the default), the
+    generated scala package name is the package_name.basename where
+    basename is the proto file name without the .proto extension.
     """
 
     import_: List[str] = betterproto.string_field(3)
     """
-    Adds the following imports at the top of the file (this is meant to provide
-    implicit TypeMappers)
+    Adds the following imports at the top of the file (this is meant
+    to provide implicit TypeMappers)
     """
 
     preamble: List[str] = betterproto.string_field(4)
     """
-    Text to add to the generated scala file. This can be used only when
-    single_file is true.
+    Text to add to the generated scala file. This can be used only
+    when single_file is true.
     """
 
     single_file: bool = betterproto.bool_field(5)
     """
-    If true, all messages and enums (but not services) will be written to a
-    single Scala file.
+    If true, all messages and enums (but not services) will be written
+    to a single Scala file.
     """
 
     no_primitive_wrappers: bool = betterproto.bool_field(7)
     """
-    By default, wrappers defined at https://github.com/google/protobuf/blob/mas
-    ter/src/google/protobuf/wrappers.proto, are mapped to an Option[T] where T
-    is a primitive type. When this field is set to true, we do not perform this
-    transformation.
+    By default, wrappers defined at
+    https://github.com/google/protobuf/blob/master/src/google/protobuf/wrappers.proto,
+    are mapped to an Option[T] where T is a primitive type. When this field
+    is set to true, we do not perform this transformation.
     """
 
     primitive_wrappers: bool = betterproto.bool_field(6)
     """
     DEPRECATED. In ScalaPB <= 0.5.47, it was necessary to explicitly enable
-    primitive_wrappers. This field remains here for backwards compatibility,
-    but it has no effect on generated code. It is an error to set both
-    `primitive_wrappers` and `no_primitive_wrappers`.
+    primitive_wrappers. This field remains here for backwards compatibility,
+    but it has no effect on generated code. It is an error to set both
+    `primitive_wrappers` and `no_primitive_wrappers`.
     """
 
     collection_type: str = betterproto.string_field(8)
     """
     Scala type to be used for repeated fields. If unspecified,
-    `scala.collection.Seq` will be used.
+    `scala.collection.Seq` will be used.
     """
 
     preserve_unknown_fields: bool = betterproto.bool_field(9)
     """
     If set to true, all generated messages in this file will preserve unknown
-    fields.
+    fields.
     """
 
     object_name: str = betterproto.string_field(10)
     """
-    If defined, sets the name of the file-level object that would be generated.
-    This object extends `GeneratedFileObject` and contains descriptors, and
-    list of message and enum companions.
+    If defined, sets the name of the file-level object that would be generated. This
+    object extends `GeneratedFileObject` and contains descriptors, and list of message
+    and enum companions.
     """
 
     scope: "ScalaPbOptionsOptionsScope" = betterproto.enum_field(11)
@@ -114,15 +113,15 @@ class ScalaPbOptions(betterproto.Message):
     retain_source_code_info: bool = betterproto.bool_field(13)
     """
     If true, then source-code info information will be included in the
-    generated code - normally the source code info is cleared out to reduce
-    code size. The source code info is useful for extracting source code
-    location from the descriptors as well as comments.
+    generated code - normally the source code info is cleared out to reduce
+    code size. The source code info is useful for extracting source code
+    location from the descriptors as well as comments.
     """
 
     map_type: str = betterproto.string_field(14)
     """
     Scala type to be used for maps. If unspecified,
-    `scala.collection.immutable.Map` will be used.
+    `scala.collection.immutable.Map` will be used.
     """
 
     no_default_values_in_constructor: bool = betterproto.bool_field(15)
@@ -133,8 +132,8 @@ class ScalaPbOptions(betterproto.Message):
     enum_value_naming: "ScalaPbOptionsEnumValueNaming" = betterproto.enum_field(16)
     enum_strip_prefix: bool = betterproto.bool_field(17)
     """
-    Indicate if prefix (enum name + optional underscore) should be removed in
-    scala code Strip is applied before enum value naming changes.
+    Indicate if prefix (enum name + optional underscore) should be removed in scala code
+    Strip is applied before enum value naming changes.
     """
 
     bytes_type: str = betterproto.string_field(21)
@@ -169,9 +168,8 @@ class ScalaPbOptions(betterproto.Message):
     field_transformations: List["FieldTransformation"] = betterproto.message_field(25)
     ignore_all_transformations: bool = betterproto.bool_field(26)
     """
-    Ignores all transformations for this file. This is meant to allow specific
-    files to opt out from transformations inherited through package-scoped
-    options.
+    Ignores all transformations for this file. This is meant to allow specific files to
+    opt out from transformations inherited through package-scoped options.
     """
 
     getters: bool = betterproto.bool_field(27)
@@ -179,17 +177,17 @@ class ScalaPbOptions(betterproto.Message):
 
     test_only_no_java_conversions: bool = betterproto.bool_field(999)
     """
-    For use in tests only. Inhibit Java conversions even when when generator
-    parameters request for it.
+    For use in tests only. Inhibit Java conversions even when when generator parameters
+    request for it.
     """
 
 
 @dataclass(eq=False, repr=False)
 class ScalaPbOptionsAuxMessageOptions(betterproto.Message):
     """
-    AuxMessageOptions enables you to set message-level options through package-
-    scoped options. This is useful when you can't add a dependency on
-    scalapb.proto from the proto file that defines the message.
+    AuxMessageOptions enables you to set message-level options through package-scoped options.
+    This is useful when you can't add a dependency on scalapb.proto from the proto file that
+    defines the message.
     """
 
     target: str = betterproto.string_field(1)
@@ -197,17 +195,17 @@ class ScalaPbOptionsAuxMessageOptions(betterproto.Message):
 
     options: "MessageOptions" = betterproto.message_field(2)
     """
-    Options to apply to the message. If there are any options defined on the
-    target message they take precedence over the options.
+    Options to apply to the message. If there are any options defined on the target message
+    they take precedence over the options.
     """
 
 
 @dataclass(eq=False, repr=False)
 class ScalaPbOptionsAuxFieldOptions(betterproto.Message):
     """
-    AuxFieldOptions enables you to set field-level options through package-
-    scoped options. This is useful when you can't add a dependency on
-    scalapb.proto from the proto file that defines the field.
+    AuxFieldOptions enables you to set field-level options through package-scoped options.
+    This is useful when you can't add a dependency on scalapb.proto from the proto file that
+    defines the field.
     """
 
     target: str = betterproto.string_field(1)
@@ -215,17 +213,17 @@ class ScalaPbOptionsAuxFieldOptions(betterproto.Message):
 
     options: "FieldOptions" = betterproto.message_field(2)
    """
-    Options to apply to the field. If there are any options defined on the
-    target message they take precedence over the options.
+    Options to apply to the field. If there are any options defined on the target message
+    they take precedence over the options.
     """
 
 
 @dataclass(eq=False, repr=False)
 class ScalaPbOptionsAuxEnumOptions(betterproto.Message):
     """
-    AuxEnumOptions enables you to set enum-level options through package-scoped
-    options. This is useful when you can't add a dependency on scalapb.proto
-    from the proto file that defines the enum.
+    AuxEnumOptions enables you to set enum-level options through package-scoped options.
+    This is useful when you can't add a dependency on scalapb.proto from the proto file that
+    defines the enum.
     """
 
     target: str = betterproto.string_field(1)
@@ -233,17 +231,17 @@ class ScalaPbOptionsAuxEnumOptions(betterproto.Message):
 
     options: "EnumOptions" = betterproto.message_field(2)
     """
-    Options to apply to the enum. If there are any options defined on the
-    target enum they take precedence over the options.
+    Options to apply to the enum. If there are any options defined on the target enum
+    they take precedence over the options.
    """
 
 
 @dataclass(eq=False, repr=False)
 class ScalaPbOptionsAuxEnumValueOptions(betterproto.Message):
     """
-    AuxEnumValueOptions enables you to set enum value level options through
-    package-scoped options. This is useful when you can't add a dependency on
-    scalapb.proto from the proto file that defines the enum.
+    AuxEnumValueOptions enables you to set enum value level options through package-scoped
+    options. This is useful when you can't add a dependency on scalapb.proto from the proto
+    file that defines the enum.
     """
 
     target: str = betterproto.string_field(1)
@@ -251,8 +249,8 @@ class ScalaPbOptionsAuxEnumValueOptions(betterproto.Message):
 
     options: "EnumValueOptions" = betterproto.message_field(2)
     """
-    Options to apply to the enum value. If there are any options defined on the
-    target enum value they take precedence over the options.
+    Options to apply to the enum value. If there are any options defined on
+    the target enum value they take precedence over the options.
     """
 
 
@@ -269,8 +267,8 @@ class MessageOptions(betterproto.Message):
 
     type: str = betterproto.string_field(4)
     """
-    All instances of this message will be converted to this type. An implicit
-    TypeMapper must be present.
+    All instances of this message will be converted to this type. An implicit TypeMapper
+    must be present.
     """
 
     companion_annotations: List[str] = betterproto.string_field(5)
@@ -280,30 +278,26 @@ class MessageOptions(betterproto.Message):
 
     sealed_oneof_extends: List[str] = betterproto.string_field(6)
     """
-    Additional classes and traits to mix in to generated sealed_oneof base
-    trait.
+    Additional classes and traits to mix in to generated sealed_oneof base trait.
     """
 
     no_box: bool = betterproto.bool_field(7)
     """
-    If true, when this message is used as an optional field, do not wrap it in
-    an `Option`. This is equivalent of setting `(field).no_box` to true on each
-    field with the message type.
+    If true, when this message is used as an optional field, do not wrap it in an `Option`.
+    This is equivalent of setting `(field).no_box` to true on each field with the message type.
     """
 
     unknown_fields_annotations: List[str] = betterproto.string_field(8)
     """
-    Custom annotations to add to the generated `unknownFields` case class
-    field.
+    Custom annotations to add to the generated `unknownFields` case class field.
     """
 
 
 @dataclass(eq=False, repr=False)
 class Collection(betterproto.Message):
     """
-    Represents a custom Collection type in Scala. This allows ScalaPB to
-    integrate with collection types that are different enough from the ones in
-    the standard library.
+    Represents a custom Collection type in Scala. This allows ScalaPB to integrate with
+    collection types that are different enough from the ones in the standard library.
     """
 
     type: str = betterproto.string_field(1)
@@ -312,14 +306,14 @@ class Collection(betterproto.Message):
 
     non_empty: bool = betterproto.bool_field(2)
     """
     Set to true if this collection type is not allowed to be empty, for example
-    cats.data.NonEmptyList. When true, ScalaPB will not generate `clearX` for
-    the repeated field and not provide a default argument in the constructor.
+    cats.data.NonEmptyList. When true, ScalaPB will not generate `clearX` for the repeated
+    field and not provide a default argument in the constructor.
     """
 
     adapter: str = betterproto.string_field(3)
     """
-    An Adapter is a Scala object available at runtime that provides certain
-    static methods that can operate on this collection type.
+    An Adapter is a Scala object available at runtime that provides certain static methods
+    that can operate on this collection type.
     """
 
@@ -329,16 +323,16 @@ class FieldOptions(betterproto.Message):
     scala_name: str = betterproto.string_field(2)
     collection_type: str = betterproto.string_field(3)
     """
-    Can be specified only if this field is repeated. If unspecified, it falls
-    back to the file option named `collection_type`, which defaults to
-    `scala.collection.Seq`.
+    Can be specified only if this field is repeated. If unspecified,
+    it falls back to the file option named `collection_type`, which defaults
+    to `scala.collection.Seq`.
     """
 
     collection: "Collection" = betterproto.message_field(8)
     key_type: str = betterproto.string_field(4)
     """
-    If the field is a map, you can specify custom Scala types for the key or
-    value.
+    If the field is a map, you can specify custom Scala types for the key
+    or value.
""" value_type: str = betterproto.string_field(5) @@ -347,22 +341,20 @@ class FieldOptions(betterproto.Message): map_type: str = betterproto.string_field(7) """ - Can be specified only if this field is a map. If unspecified, it falls back - to the file option named `map_type` which defaults to - `scala.collection.immutable.Map` + Can be specified only if this field is a map. If unspecified, + it falls back to the file option named `map_type` which defaults to + `scala.collection.immutable.Map` """ no_box: bool = betterproto.bool_field(30) """ - Do not box this value in Option[T]. If set, this overrides - MessageOptions.no_box + Do not box this value in Option[T]. If set, this overrides MessageOptions.no_box """ required: bool = betterproto.bool_field(31) """ - Like no_box it does not box a value in Option[T], but also fails parsing - when a value is not provided. This enables to emulate required fields in - proto3. + Like no_box it does not box a value in Option[T], but also fails parsing when a value + is not provided. This enables to emulate required fields in proto3. """ @@ -376,8 +368,8 @@ class EnumOptions(betterproto.Message): type: str = betterproto.string_field(3) """ - All instances of this enum will be converted to this type. An implicit - TypeMapper must be present. + All instances of this enum will be converted to this type. An implicit TypeMapper + must be present. 
""" base_annotations: List[str] = betterproto.string_field(4) diff --git a/core/scripts/python-proto-gen.sh b/core/scripts/python-proto-gen.sh index 21db91b5b0d..135be499521 100755 --- a/core/scripts/python-proto-gen.sh +++ b/core/scripts/python-proto-gen.sh @@ -4,7 +4,14 @@ TEXERA_ROOT="$(git rev-parse --show-toplevel)" AMBER_DIR="$TEXERA_ROOT/core/amber" PYAMBER_DIR="$AMBER_DIR/src/main/python" -PROTOBUF_DIR="$AMBER_DIR/src/main/protobuf" +PROTOBUF_AMBER_DIR="$AMBER_DIR/src/main/protobuf" + +CORE_DIR="$TEXERA_ROOT/core/workflow-core" +PROTOBUF_CORE_DIR="$CORE_DIR/src/main/protobuf" # proto-gen -protoc --python_betterproto_out="$PYAMBER_DIR/proto" -I="$PROTOBUF_DIR" $(find "$PROTOBUF_DIR" -iname "*.proto") --proto_path="$PROTOBUF_DIR" \ No newline at end of file +protoc --python_betterproto_out="$PYAMBER_DIR/proto" \ + -I="$PROTOBUF_AMBER_DIR" \ + -I="$PROTOBUF_CORE_DIR" \ + $(find "$PROTOBUF_AMBER_DIR" -iname "*.proto") \ + $(find "$PROTOBUF_CORE_DIR" -iname "*.proto") From 461789079f9d76037fbd8ed080dca3ada908a3ff Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Tue, 31 Dec 2024 01:22:32 -0800 Subject: [PATCH 22/47] Make OpExecInitInfo serializable (#3183) Previously, creating a physical operator during compilation also required the creation of its corresponding executor instances. To delay this process so that executor instances were created within workers, we used a lambda function (in `OpExecInitInfo`). However, the lambda approach had a critical limitation: it was not serializable and it is language dependent. This PR addresses this issue by replacing the lambda functions in `OpExecInitInfo` with fully serializable Protobuf entities. The serialized information now ensures compatibility with distributed environments and is language-independent. Two primary types of `OpExecInitInfo` are introduced: 1. **`OpExecWithClassName`**: - **Fields**: `className: String`, `descString: String`. 
- **Behavior**: The language compiler dynamically loads the class specified by `className` and uses `descString` as its initialization argument. 2. **`OpExecWithCode`**: - **Fields**: `code: String`, `language: String`. - **Behavior**: The language compiler compiles the provided `code` based on the specified `language`. The arguments are already pre-populated into the code string. ### Special Cases The `ProgressiveSink` and `CacheSource` executors are treated as special cases. These executors require additional unique information (e.g., `storageKey`, `workflowIdentity`, `outputMode`) to initialize their executor instances. While this PR preserves the handling of these special cases, these executors will eventually be refactored or removed as part of the plan to move storage management to the port layer. --- .../architecture/rpc/controlcommands.proto | 4 +- .../control/initialize_executor_handler.py | 6 +- .../architecture/managers/executor_manager.py | 1 - .../python/core/runnables/test_main_loop.py | 35 ++++---- .../proto/edu/uci/ics/amber/core/__init__.py | 39 ++++++++- .../amber/engine/architecture/rpc/__init__.py | 5 +- core/amber/src/main/resources/cluster.conf | 1 - .../pythonworker/PythonProxyClient.scala | 28 +----- .../RegionExecutionCoordinator.scala | 12 +-- .../managers/SerializationManager.scala | 31 +++---- .../InitializeExecutorHandler.scala | 29 ++++--- .../uci/ics/texera/workflow/LogicalPlan.scala | 2 +- .../texera/workflow/WorkflowCompiler.scala | 2 +- .../architecture/worker/WorkerSpec.scala | 35 ++++---- .../amber/compiler/model/LogicalPlan.scala | 7 +- .../edu/uci/ics/amber/core/executor.proto | 44 ++++++++++ .../ics/amber/core/executor/ExecFactory.scala | 44 ++++++++++ .../amber/core/executor/OpExecInitInfo.scala | 54 ------------ .../ics/amber/core/storage/FileResolver.scala | 18 ++++ .../ics/amber/core/workflow/PhysicalOp.scala | 42 +-------- .../operator/PythonOperatorDescriptor.scala | 10 +-- .../operator/SpecialPhysicalOpFactory.scala | 
19 ++-- .../ics/amber/operator/TestOperators.scala | 4 +- .../operator/aggregate/AggregateOpDesc.scala | 72 ++++++++------- .../operator/aggregate/AggregateOpExec.scala | 18 ++-- .../CartesianProductOpDesc.scala | 9 +- .../dictionary/DictionaryMatcherOpDesc.scala | 8 +- .../dictionary/DictionaryMatcherOpExec.scala | 19 ++-- .../difference/DifferenceOpDesc.scala | 4 +- .../operator/distinct/DistinctOpDesc.scala | 4 +- .../filter/SpecializedFilterOpDesc.scala | 8 +- .../filter/SpecializedFilterOpExec.scala | 8 +- .../hashJoin/HashJoinBuildOpExec.scala | 8 +- .../operator/hashJoin/HashJoinOpDesc.scala | 16 ++-- .../hashJoin/HashJoinProbeOpExec.scala | 21 +++-- .../operator/intersect/IntersectOpDesc.scala | 4 +- .../intervalJoin/IntervalJoinOpDesc.scala | 15 ++-- .../intervalJoin/IntervalJoinOpExec.scala | 63 +++++++------- .../keywordSearch/KeywordSearchOpDesc.scala | 8 +- .../keywordSearch/KeywordSearchOpExec.scala | 12 ++- .../amber/operator/limit/LimitOpDesc.scala | 10 ++- .../amber/operator/limit/LimitOpExec.scala | 6 +- .../projection/ProjectionOpDesc.scala | 8 +- .../projection/ProjectionOpExec.scala | 13 +-- .../RandomKSamplingOpDesc.scala | 15 ++-- .../RandomKSamplingOpExec.scala | 11 ++- .../amber/operator/regex/RegexOpDesc.scala | 8 +- .../amber/operator/regex/RegexOpExec.scala | 9 +- .../ReservoirSamplingOpDesc.scala | 15 ++-- .../ReservoirSamplingOpExec.scala | 15 ++-- .../sentiment/SentimentAnalysisOpDesc.scala | 8 +- .../sentiment/SentimentAnalysisOpExec.java | 7 +- .../{managed => }/ProgressiveSinkOpExec.scala | 8 +- .../sortPartitions/SortPartitionsOpDesc.scala | 15 ++-- ...pExec.scala => SortPartitionsOpExec.scala} | 16 ++-- .../source/SourceOperatorDescriptor.scala | 5 +- .../apis/twitter/TwitterSourceOpExec.scala | 13 +-- ...TwitterFullArchiveSearchSourceOpDesc.scala | 17 ++-- ...TwitterFullArchiveSearchSourceOpExec.scala | 27 +++--- .../v2/TwitterSearchSourceOpDesc.scala | 15 ++-- .../v2/TwitterSearchSourceOpExec.scala | 20 ++--- 
.../source/fetcher/URLFetcherOpDesc.scala | 10 ++- .../source/fetcher/URLFetcherOpExec.scala | 11 +-- .../source/scan/FileScanSourceOpDesc.scala | 22 ++--- .../source/scan/FileScanSourceOpExec.scala | 35 ++++---- .../source/scan/ScanSourceOpDesc.scala | 18 ++-- .../source/scan/arrow/ArrowSourceOpDesc.scala | 20 ++--- .../source/scan/arrow/ArrowSourceOpExec.scala | 24 +++-- .../source/scan/csv/CSVScanSourceOpDesc.scala | 40 +++------ .../source/scan/csv/CSVScanSourceOpExec.scala | 32 +++---- .../csv/ParallelCSVScanSourceOpDesc.scala | 45 +++------- .../csv/ParallelCSVScanSourceOpExec.scala | 35 +++++--- .../scan/csvOld/CSVOldScanSourceOpDesc.scala | 39 +++------ .../scan/csvOld/CSVOldScanSourceOpExec.scala | 38 ++++---- .../scan/json/JSONLScanSourceOpDesc.scala | 48 +++------- .../scan/json/JSONLScanSourceOpExec.scala | 39 +++++---- .../scan/text/TextInputSourceOpDesc.scala | 8 +- .../scan/text/TextInputSourceOpExec.scala | 23 +++-- .../operator/source/sql/SQLSourceOpDesc.scala | 24 ++--- .../operator/source/sql/SQLSourceOpExec.scala | 78 ++++++++--------- .../sql/asterixdb/AsterixDBSourceOpDesc.scala | 51 +++-------- .../sql/asterixdb/AsterixDBSourceOpExec.scala | 87 +++++++------------ .../source/sql/mysql/MySQLSourceOpDesc.scala | 26 ++---- .../source/sql/mysql/MySQLSourceOpExec.scala | 50 +++-------- .../postgresql/PostgreSQLSourceOpDesc.scala | 26 ++---- .../postgresql/PostgreSQLSourceOpExec.scala | 48 +++------- .../amber/operator/split/SplitOpDesc.scala | 8 +- .../amber/operator/split/SplitOpExec.scala | 10 +-- .../SymmetricDifferenceOpDesc.scala | 6 +- .../typecasting/TypeCastingOpDesc.scala | 8 +- .../typecasting/TypeCastingOpExec.scala | 9 +- .../operator/udf/java/JavaUDFOpDesc.scala | 6 +- .../DualInputPortsPythonUDFOpDescV2.scala | 6 +- .../udf/python/PythonUDFOpDescV2.scala | 6 +- .../source/PythonUDFSourceOpDescV2.scala | 5 +- .../ics/amber/operator/udf/r/RUDFOpDesc.scala | 6 +- .../operator/udf/r/RUDFSourceOpDesc.scala | 10 ++- 
.../amber/operator/union/UnionOpDesc.scala | 4 +- .../unneststring/UnnestStringOpDesc.scala | 8 +- .../unneststring/UnnestStringOpExec.scala | 10 ++- .../visualization/htmlviz/HtmlVizOpDesc.scala | 8 +- .../visualization/htmlviz/HtmlVizOpExec.scala | 6 +- .../visualization/urlviz/UrlVizOpDesc.scala | 10 ++- .../visualization/urlviz/UrlVizOpExec.scala | 7 +- .../DictionaryMatcherOpExecSpec.scala | 37 ++++---- .../filter/SpecializedFilterOpExecSpec.scala | 50 ++++------- .../operator/hashJoin/HashJoinOpSpec.scala | 27 +++--- .../intervalJoin/IntervalOpExecSpec.scala | 26 ++---- .../KeywordSearchOpExecSpec.scala | 62 +++++++++---- .../projection/ProjectionOpExecSpec.scala | 73 +++++++--------- .../SortPartitionsOpExecSpec.scala | 13 ++- .../source/fetcher/URLFetcherOpExecSpec.scala | 11 ++- .../scan/csv/CSVScanSourceOpDescSpec.scala | 22 ++--- .../scan/text/FileScanSourceOpDescSpec.scala | 74 +++++----------- .../scan/text/TextInputSourceOpDescSpec.scala | 19 +++- .../typecasting/TypeCastingOpExecSpec.scala | 8 +- .../unneststring/UnnestStringOpExecSpec.scala | 18 ++-- .../htmlviz/HtmlVizOpExecSpec.scala | 12 +-- 118 files changed, 1161 insertions(+), 1331 deletions(-) create mode 100644 core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/executor.proto create mode 100644 core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/ExecFactory.scala delete mode 100644 core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/OpExecInitInfo.scala rename core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/{managed => }/ProgressiveSinkOpExec.scala (89%) rename core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/{SortPartitionOpExec.scala => SortPartitionsOpExec.scala} (77%) diff --git a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto index 81f7f4b21ba..5df9e7ab47b 
100644 --- a/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto +++ b/core/amber/src/main/protobuf/edu/uci/ics/amber/engine/architecture/rpc/controlcommands.proto @@ -3,6 +3,7 @@ package edu.uci.ics.amber.engine.architecture.rpc; import "edu/uci/ics/amber/core/virtualidentity.proto"; import "edu/uci/ics/amber/core/workflow.proto"; +import "edu/uci/ics/amber/core/executor.proto"; import "edu/uci/ics/amber/engine/architecture/worker/statistics.proto"; import "edu/uci/ics/amber/engine/architecture/sendsemantics/partitionings.proto"; import "scalapb/scalapb.proto"; @@ -235,9 +236,8 @@ message FinalizeCheckpointRequest { message InitializeExecutorRequest { int32 totalWorkerCount = 1; - google.protobuf.Any opExecInitInfo = 2 [(scalapb.field).no_box = true]; + core.OpExecInitInfo opExecInitInfo = 2; bool isSource = 3; - string language = 4; } message UpdateExecutorRequest { diff --git a/core/amber/src/main/python/core/architecture/handlers/control/initialize_executor_handler.py b/core/amber/src/main/python/core/architecture/handlers/control/initialize_executor_handler.py index fbff3b09418..9ae770fca13 100644 --- a/core/amber/src/main/python/core/architecture/handlers/control/initialize_executor_handler.py +++ b/core/amber/src/main/python/core/architecture/handlers/control/initialize_executor_handler.py @@ -1,4 +1,6 @@ from core.architecture.handlers.control.control_handler_base import ControlHandler +from core.util import get_one_of +from proto.edu.uci.ics.amber.core import OpExecWithCode from proto.edu.uci.ics.amber.engine.architecture.rpc import ( EmptyReturn, InitializeExecutorRequest, @@ -8,8 +10,8 @@ class InitializeExecutorHandler(ControlHandler): async def initialize_executor(self, req: InitializeExecutorRequest) -> EmptyReturn: - code = req.op_exec_init_info.value.decode("utf-8") + op_exec_with_code: OpExecWithCode = get_one_of(req.op_exec_init_info) self.context.executor_manager.initialize_executor( - code, req.is_source, 
req.language + op_exec_with_code.code, req.is_source, op_exec_with_code.language ) return EmptyReturn() diff --git a/core/amber/src/main/python/core/architecture/managers/executor_manager.py b/core/amber/src/main/python/core/architecture/managers/executor_manager.py index 0ab6fd9c33e..238e6c3f9a1 100644 --- a/core/amber/src/main/python/core/architecture/managers/executor_manager.py +++ b/core/amber/src/main/python/core/architecture/managers/executor_manager.py @@ -114,7 +114,6 @@ def initialize_executor(self, code: str, is_source: bool, language: str) -> None class declaration. :param is_source: Indicating if the operator is used as a source operator. :param language: The language of the operator code. - :param output_schema: the raw mapping of output schema, name -> type_str. :return: """ if language == "r-tuple": diff --git a/core/amber/src/main/python/core/runnables/test_main_loop.py b/core/amber/src/main/python/core/runnables/test_main_loop.py index 77981fade9b..910149b06a1 100644 --- a/core/amber/src/main/python/core/runnables/test_main_loop.py +++ b/core/amber/src/main/python/core/runnables/test_main_loop.py @@ -1,10 +1,10 @@ import inspect +import pickle from threading import Thread import pandas import pyarrow import pytest -import pickle from core.models import ( DataFrame, @@ -16,6 +16,16 @@ from core.models.marker import EndOfInputChannel from core.runnables import MainLoop from core.util import set_one_of +from proto.edu.uci.ics.amber.core import ( + ActorVirtualIdentity, + PhysicalLink, + PhysicalOpIdentity, + OperatorIdentity, + ChannelIdentity, + PortIdentity, + OpExecWithCode, + OpExecInitInfo, +) from proto.edu.uci.ics.amber.engine.architecture.rpc import ( ControlRequest, AssignPortRequest, @@ -42,18 +52,9 @@ WorkerStatistics, PortTupleCountMapping, ) -from proto.edu.uci.ics.amber.core import ( - ActorVirtualIdentity, - PhysicalLink, - PhysicalOpIdentity, - OperatorIdentity, - ChannelIdentity, - PortIdentity, -) from 
proto.edu.uci.ics.amber.engine.common import ControlPayloadV2 from pytexera.udf.examples.count_batch_operator import CountBatchOperator from pytexera.udf.examples.echo_operator import EchoOperator -from google.protobuf.any_pb2 import Any as ProtoAny class TestMainLoop: @@ -270,13 +271,14 @@ def mock_initialize_executor( command_sequence, mock_raw_schema, ): - proto_any = ProtoAny() + operator_code = "from pytexera import *\n" + inspect.getsource(EchoOperator) - proto_any.value = operator_code.encode("utf-8") command = set_one_of( ControlRequest, InitializeExecutorRequest( - op_exec_init_info=proto_any, + op_exec_init_info=set_one_of( + OpExecInitInfo, OpExecWithCode(operator_code, "python") + ), is_source=False, ), ) @@ -299,15 +301,16 @@ def mock_initialize_batch_count_executor( command_sequence, mock_raw_schema, ): - proto_any = ProtoAny() + operator_code = "from pytexera import *\n" + inspect.getsource( CountBatchOperator ) - proto_any.value = operator_code.encode("utf-8") command = set_one_of( ControlRequest, InitializeExecutorRequest( - op_exec_init_info=proto_any, + op_exec_init_info=set_one_of( + OpExecInitInfo, OpExecWithCode(operator_code, "python") + ), is_source=False, ), ) diff --git a/core/amber/src/main/python/proto/edu/uci/ics/amber/core/__init__.py b/core/amber/src/main/python/proto/edu/uci/ics/amber/core/__init__.py index 0cb9940da1d..31c15ddbc9b 100644 --- a/core/amber/src/main/python/proto/edu/uci/ics/amber/core/__init__.py +++ b/core/amber/src/main/python/proto/edu/uci/ics/amber/core/__init__.py @@ -1,5 +1,5 @@ # Generated by the protocol buffer compiler. DO NOT EDIT! 
-# sources: edu/uci/ics/amber/core/virtualidentity.proto, edu/uci/ics/amber/core/workflow.proto, edu/uci/ics/amber/core/workflowruntimestate.proto +# sources: edu/uci/ics/amber/core/executor.proto, edu/uci/ics/amber/core/virtualidentity.proto, edu/uci/ics/amber/core/workflow.proto, edu/uci/ics/amber/core/workflowruntimestate.proto # plugin: python-betterproto # This file has been @generated @@ -98,6 +98,43 @@ class PhysicalLink(betterproto.Message): to_port_id: "PortIdentity" = betterproto.message_field(4) +@dataclass(eq=False, repr=False) +class OpExecWithCode(betterproto.Message): + code: str = betterproto.string_field(1) + language: str = betterproto.string_field(2) + + +@dataclass(eq=False, repr=False) +class OpExecWithClassName(betterproto.Message): + class_name: str = betterproto.string_field(1) + desc_string: str = betterproto.string_field(2) + + +@dataclass(eq=False, repr=False) +class OpExecSink(betterproto.Message): + storage_key: str = betterproto.string_field(1) + workflow_identity: "WorkflowIdentity" = betterproto.message_field(2) + output_mode: "OutputPortOutputMode" = betterproto.enum_field(3) + + +@dataclass(eq=False, repr=False) +class OpExecSource(betterproto.Message): + storage_key: str = betterproto.string_field(1) + workflow_identity: "WorkflowIdentity" = betterproto.message_field(2) + + +@dataclass(eq=False, repr=False) +class OpExecInitInfo(betterproto.Message): + op_exec_with_class_name: "OpExecWithClassName" = betterproto.message_field( + 1, group="sealed_value" + ) + op_exec_with_code: "OpExecWithCode" = betterproto.message_field( + 2, group="sealed_value" + ) + op_exec_sink: "OpExecSink" = betterproto.message_field(3, group="sealed_value") + op_exec_source: "OpExecSource" = betterproto.message_field(4, group="sealed_value") + + @dataclass(eq=False, repr=False) class WorkflowFatalError(betterproto.Message): type: "FatalErrorType" = betterproto.enum_field(1) diff --git 
a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/rpc/__init__.py b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/rpc/__init__.py index 676292d9605..9320b54e36e 100644 --- a/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/rpc/__init__.py +++ b/core/amber/src/main/python/proto/edu/uci/ics/amber/engine/architecture/rpc/__init__.py @@ -355,11 +355,8 @@ class FinalizeCheckpointRequest(betterproto.Message): @dataclass(eq=False, repr=False) class InitializeExecutorRequest(betterproto.Message): total_worker_count: int = betterproto.int32_field(1) - op_exec_init_info: "betterproto_lib_google_protobuf.Any" = ( - betterproto.message_field(2) - ) + op_exec_init_info: "___core__.OpExecInitInfo" = betterproto.message_field(2) is_source: bool = betterproto.bool_field(3) - language: str = betterproto.string_field(4) @dataclass(eq=False, repr=False) diff --git a/core/amber/src/main/resources/cluster.conf b/core/amber/src/main/resources/cluster.conf index 67e0e847a97..f3ae050244c 100644 --- a/core/amber/src/main/resources/cluster.conf +++ b/core/amber/src/main/resources/cluster.conf @@ -27,7 +27,6 @@ akka { serialization-bindings { "java.io.Serializable" = kryo "java.lang.Throwable" = akka-misc - "edu.uci.ics.amber.core.executor.OpExecInitInfo" = kryo } } diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala index 05cb7b0758a..c7dc6400c1e 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala @@ -1,27 +1,21 @@ package edu.uci.ics.amber.engine.architecture.pythonworker -import com.google.protobuf.ByteString -import com.google.protobuf.any.Any import com.twitter.util.{Await, Promise} import 
edu.uci.ics.amber.core.WorkflowRuntimeException -import edu.uci.ics.amber.core.executor.{OpExecInitInfo, OpExecInitInfoWithCode} import edu.uci.ics.amber.core.marker.State import edu.uci.ics.amber.core.tuple.{Schema, Tuple} +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity import edu.uci.ics.amber.engine.architecture.pythonworker.WorkerBatchInternalQueue.{ ActorCommandElement, ControlElement, DataElement } -import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.{ - ControlInvocation, - InitializeExecutorRequest -} +import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.ControlInvocation import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.ReturnInvocation +import edu.uci.ics.amber.engine.common.AmberLogging import edu.uci.ics.amber.engine.common.actormessage.{ActorCommand, PythonActorMessage} import edu.uci.ics.amber.engine.common.ambermessage._ -import edu.uci.ics.amber.engine.common.{AmberLogging, AmberRuntime} import edu.uci.ics.amber.util.ArrowUtils -import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity import org.apache.arrow.flight._ import org.apache.arrow.memory.{ArrowBuf, BufferAllocator, RootAllocator} import org.apache.arrow.vector.VectorSchemaRoot @@ -120,21 +114,7 @@ class PythonProxyClient(portNumberPromise: Promise[Int], val actorId: ActorVirtu var payloadV2 = ControlPayloadV2.defaultInstance payloadV2 = payload match { case c: ControlInvocation => - val req = c.command match { - case InitializeExecutorRequest(worker, info, isSource, _) => - val bytes = info.value.toByteArray - val opExecInitInfo: OpExecInitInfo = - AmberRuntime.serde.deserialize(bytes, classOf[OpExecInitInfo]).get - val (code, language) = opExecInitInfo.asInstanceOf[OpExecInitInfoWithCode].codeGen(0, 0) - InitializeExecutorRequest( - worker, - Any.of("", ByteString.copyFrom(code, "UTF-8")), - isSource, - language - ) - case other => other - } - payloadV2.withControlInvocation(c.withCommand(req)) + 
payloadV2.withControlInvocation(c) case r: ReturnInvocation => payloadV2.withReturnInvocation(r) case _ => ??? diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionExecutionCoordinator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionExecutionCoordinator.scala index 8567e0f17bb..3d58c3635e6 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionExecutionCoordinator.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/RegionExecutionCoordinator.scala @@ -1,7 +1,5 @@ package edu.uci.ics.amber.engine.architecture.scheduling -import com.google.protobuf.ByteString -import com.google.protobuf.any.Any import com.twitter.util.Future import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.engine.architecture.common.{AkkaActorService, ExecutorDeployment} @@ -25,7 +23,6 @@ import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.{ WorkflowAggregatedState } import edu.uci.ics.amber.engine.architecture.scheduling.config.{OperatorConfig, ResourceConfig} -import edu.uci.ics.amber.engine.common.AmberRuntime import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER import edu.uci.ics.amber.core.workflow.PhysicalLink @@ -131,16 +128,11 @@ class RegionExecutionCoordinator( .flatMap(physicalOp => { val workerConfigs = resourceConfig.operatorConfigs(physicalOp.id).workerConfigs workerConfigs.map(_.workerId).map { workerId => - val bytes = AmberRuntime.serde.serialize(physicalOp.opExecInitInfo).get asyncRPCClient.workerInterface.initializeExecutor( InitializeExecutorRequest( workerConfigs.length, - Any.of( - "edu.uci.ics.amber.engine.architecture.deploysemantics.layer.OpExecInitInfo", - ByteString.copyFrom(bytes) - ), - physicalOp.isSourceOperator, - "scala" + physicalOp.opExecInitInfo, + physicalOp.isSourceOperator ), 
asyncRPCClient.mkContext(workerId) ) diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/SerializationManager.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/SerializationManager.scala index a6ea4f74e4b..5a1015e708d 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/SerializationManager.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/managers/SerializationManager.scala @@ -1,18 +1,12 @@ package edu.uci.ics.amber.engine.architecture.worker.managers -import edu.uci.ics.amber.core.executor.OpExecInitInfo.generateJavaOpExec -import edu.uci.ics.amber.core.executor.{OpExecInitInfo, OperatorExecutor} +import edu.uci.ics.amber.core.executor._ import edu.uci.ics.amber.core.tuple.TupleLike -import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.InitializeExecutorRequest -import edu.uci.ics.amber.engine.common.{ - AmberLogging, - AmberRuntime, - CheckpointState, - CheckpointSupport -} -import edu.uci.ics.amber.util.VirtualIdentityUtils import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity import edu.uci.ics.amber.core.workflow.PortIdentity +import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.InitializeExecutorRequest +import edu.uci.ics.amber.engine.common.{AmberLogging, CheckpointState, CheckpointSupport} +import edu.uci.ics.amber.util.VirtualIdentityUtils class SerializationManager(val actorId: ActorVirtualIdentity) extends AmberLogging { @@ -26,14 +20,15 @@ class SerializationManager(val actorId: ActorVirtualIdentity) extends AmberLoggi def restoreExecutorState( chkpt: CheckpointState ): (OperatorExecutor, Iterator[(TupleLike, Option[PortIdentity])]) = { - val opExecInitInfo: OpExecInitInfo = AmberRuntime.serde - .deserialize(execInitMsg.opExecInitInfo.value.toByteArray, classOf[OpExecInitInfo]) - .get - val executor = generateJavaOpExec( - opExecInitInfo, - 
VirtualIdentityUtils.getWorkerIndex(actorId), - execInitMsg.totalWorkerCount - ) + val workerIdx = VirtualIdentityUtils.getWorkerIndex(actorId) + val workerCount = execInitMsg.totalWorkerCount + val executor = execInitMsg.opExecInitInfo match { + case OpExecWithClassName(className, descString) => + ExecFactory.newExecFromJavaClassName(className, descString, workerIdx, workerCount) + case OpExecWithCode(code, language) => ExecFactory.newExecFromJavaCode(code) + case _ => throw new UnsupportedOperationException("Unsupported OpExec type") + } + val iter = executor match { case support: CheckpointSupport => support.deserializeState(chkpt) diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/InitializeExecutorHandler.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/InitializeExecutorHandler.scala index a27e0ba614d..c727f422da9 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/InitializeExecutorHandler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/worker/promisehandlers/InitializeExecutorHandler.scala @@ -1,15 +1,15 @@ package edu.uci.ics.amber.engine.architecture.worker.promisehandlers import com.twitter.util.Future -import edu.uci.ics.amber.core.executor.OpExecInitInfo -import edu.uci.ics.amber.core.executor.OpExecInitInfo.generateJavaOpExec +import edu.uci.ics.amber.core.executor._ import edu.uci.ics.amber.engine.architecture.rpc.controlcommands.{ AsyncRPCContext, InitializeExecutorRequest } import edu.uci.ics.amber.engine.architecture.rpc.controlreturns.EmptyReturn import edu.uci.ics.amber.engine.architecture.worker.DataProcessorRPCHandlerInitializer -import edu.uci.ics.amber.engine.common.AmberRuntime +import edu.uci.ics.amber.operator.sink.ProgressiveSinkOpExec +import edu.uci.ics.amber.operator.source.cache.CacheSourceOpExec import edu.uci.ics.amber.util.VirtualIdentityUtils trait 
InitializeExecutorHandler { @@ -20,14 +20,21 @@ trait InitializeExecutorHandler { ctx: AsyncRPCContext ): Future[EmptyReturn] = { dp.serializationManager.setOpInitialization(req) - val bytes = req.opExecInitInfo.value.toByteArray - val opExecInitInfo: OpExecInitInfo = - AmberRuntime.serde.deserialize(bytes, classOf[OpExecInitInfo]).get - dp.executor = generateJavaOpExec( - opExecInitInfo, - VirtualIdentityUtils.getWorkerIndex(actorId), - req.totalWorkerCount - ) + val workerIdx = VirtualIdentityUtils.getWorkerIndex(actorId) + val workerCount = req.totalWorkerCount + dp.executor = req.opExecInitInfo match { + case OpExecWithClassName(className, descString) => + ExecFactory.newExecFromJavaClassName(className, descString, workerIdx, workerCount) + case OpExecWithCode(code, _) => ExecFactory.newExecFromJavaCode(code) + case OpExecSink(storageKey, workflowIdentity, outputMode) => + new ProgressiveSinkOpExec( + outputMode, + storageKey, + workflowIdentity + ) + case OpExecSource(storageKey, workflowIdentity) => + new CacheSourceOpExec(storageKey, workflowIdentity) + } EmptyReturn() } diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala index a76b4b589af..ca52eb9708c 100644 --- a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/LogicalPlan.scala @@ -84,7 +84,7 @@ case class LogicalPlan( val fileUri = FileResolver.resolve(fileName) // Convert to URI // Set the URI in the ScanSourceOpDesc - scanOp.setFileUri(fileUri) + scanOp.setResolvedFileName(fileUri) } match { case Success(_) => // Successfully resolved and set the file URI diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala index 6c13eadbffe..a01a1d3b38b 100644 --- 
a/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala +++ b/core/amber/src/main/scala/edu/uci/ics/texera/workflow/WorkflowCompiler.scala @@ -8,7 +8,7 @@ import edu.uci.ics.amber.engine.common.Utils.objectMapper import edu.uci.ics.amber.operator.SpecialPhysicalOpFactory import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode.SINGLE_SNAPSHOT -import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} +import edu.uci.ics.amber.core.workflow.PhysicalLink import edu.uci.ics.texera.web.model.websocket.request.LogicalPlanPojo import edu.uci.ics.texera.web.service.ExecutionsMetadataPersistService diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala index aab2548242a..4fc14016ac1 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala @@ -3,10 +3,16 @@ package edu.uci.ics.amber.engine.architecture.worker import akka.actor.{ActorRef, ActorSystem, Props} import akka.serialization.SerializationExtension import akka.testkit.{ImplicitSender, TestActorRef, TestKit} -import com.google.protobuf.ByteString -import com.google.protobuf.any.{Any => ProtoAny} import edu.uci.ics.amber.clustering.SingleNodeListener -import edu.uci.ics.amber.core.executor.{OpExecInitInfo, OperatorExecutor} +import edu.uci.ics.amber.core.executor.{OpExecWithClassName, OperatorExecutor} +import edu.uci.ics.amber.core.tuple._ +import edu.uci.ics.amber.core.virtualidentity.{ + ActorVirtualIdentity, + ChannelIdentity, + OperatorIdentity, + PhysicalOpIdentity +} +import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} import edu.uci.ics.amber.engine.architecture.common.WorkflowActor.NetworkMessage import 
edu.uci.ics.amber.engine.architecture.rpc.controlcommands._ import edu.uci.ics.amber.engine.architecture.rpc.workerservice.WorkerServiceGrpc._ @@ -20,13 +26,6 @@ import edu.uci.ics.amber.engine.common.AmberRuntime import edu.uci.ics.amber.engine.common.ambermessage.{DataFrame, DataPayload, WorkflowFIFOMessage} import edu.uci.ics.amber.engine.common.rpc.AsyncRPCClient import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER -import edu.uci.ics.amber.core.virtualidentity.{ - ActorVirtualIdentity, - ChannelIdentity, - OperatorIdentity, - PhysicalOpIdentity -} -import edu.uci.ics.amber.core.workflow.{PhysicalLink, PortIdentity} import org.scalamock.scalatest.MockFactory import org.scalatest.BeforeAndAfterAll import org.scalatest.flatspec.AnyFlatSpecLike @@ -35,7 +34,6 @@ import java.util.concurrent.CompletableFuture import scala.collection.mutable import scala.concurrent.duration.MILLISECONDS import scala.util.Random -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema, Tuple, TupleLike} class DummyOperatorExecutor extends OperatorExecutor { override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] = { Iterator(tuple) @@ -168,17 +166,14 @@ class WorkerSpec AsyncRPCContext(CONTROLLER, identifier1), 3 ) - val opInit = OpExecInitInfo((_, _) => { - new DummyOperatorExecutor() - }) - val bytes = AmberRuntime.serde.serialize(opInit).get - val protoAny = ProtoAny.of( - "edu.uci.ics.amber.engine.architecture.deploysemantics.layer.OpExecInitInfo", - ByteString.copyFrom(bytes) - ) + val initializeOperatorLogic = AsyncRPCClient.ControlInvocation( METHOD_INITIALIZE_EXECUTOR, - InitializeExecutorRequest(1, protoAny, isSource = false, "scala"), + InitializeExecutorRequest( + 1, + OpExecWithClassName("edu.uci.ics.amber.engine.architecture.worker.DummyOperatorExecutor"), + isSource = false + ), AsyncRPCContext(CONTROLLER, identifier1), 4 ) diff --git 
a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala index ea79ba5ceb7..db700e8a3df 100644 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/model/LogicalPlan.scala @@ -2,10 +2,7 @@ package edu.uci.ics.amber.compiler.model import com.typesafe.scalalogging.LazyLogging import edu.uci.ics.amber.core.storage.FileResolver -import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.WorkflowContext import edu.uci.ics.amber.operator.LogicalOp -import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc import edu.uci.ics.amber.core.virtualidentity.OperatorIdentity import edu.uci.ics.amber.core.workflow.PortIdentity @@ -14,8 +11,6 @@ import org.jgrapht.util.SupplierUtil import java.util import scala.collection.mutable.ArrayBuffer -import scala.collection.mutable -import scala.jdk.CollectionConverters.SetHasAsScala import scala.util.{Failure, Success, Try} object LogicalPlan { @@ -107,7 +102,7 @@ case class LogicalPlan( val fileUri = FileResolver.resolve(fileName) // Convert to URI // Set the URI in the ScanSourceOpDesc - scanOp.setFileUri(fileUri) + scanOp.setResolvedFileName(fileUri) } match { case Success(_) => // Successfully resolved and set the file URI case Failure(err) => diff --git a/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/executor.proto b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/executor.proto new file mode 100644 index 00000000000..fdc98b2b34f --- /dev/null +++ b/core/workflow-core/src/main/protobuf/edu/uci/ics/amber/core/executor.proto @@ -0,0 +1,44 @@ +syntax = "proto3"; +package edu.uci.ics.amber.core; + + +import 
"edu/uci/ics/amber/core/virtualidentity.proto"; +import "edu/uci/ics/amber/core/workflow.proto"; +import "scalapb/scalapb.proto"; + +option (scalapb.options) = { + scope: FILE, + preserve_unknown_fields: false + no_default_values_in_constructor: false +}; + + +message OpExecWithCode { + string code = 1; + string language = 2; +} + +message OpExecWithClassName { + string className = 1; + string descString = 2; +} + +message OpExecSink { + string storageKey = 1; + WorkflowIdentity workflowIdentity = 2 [(scalapb.field).no_box = true]; + OutputPort.OutputMode outputMode = 3 [(scalapb.field).no_box = true]; +} + +message OpExecSource { + string storageKey = 1; + WorkflowIdentity workflowIdentity = 2 [(scalapb.field).no_box = true]; +} + +message OpExecInitInfo { + oneof sealed_value { + OpExecWithClassName opExecWithClassName = 1; + OpExecWithCode opExecWithCode = 2; + OpExecSink opExecSink = 3; + OpExecSource opExecSource = 4; + } +} \ No newline at end of file diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/ExecFactory.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/ExecFactory.scala new file mode 100644 index 00000000000..8e53e2ca4ff --- /dev/null +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/ExecFactory.scala @@ -0,0 +1,44 @@ +package edu.uci.ics.amber.core.executor + +object ExecFactory { + + def newExecFromJavaCode(code: String): OperatorExecutor = { + JavaRuntimeCompilation + .compileCode(code) + .getDeclaredConstructor() + .newInstance() + .asInstanceOf[OperatorExecutor] + } + + def newExecFromJavaClassName[K]( + className: String, + descString: String = "", + idx: Int = 0, + workerCount: Int = 1 + ): OperatorExecutor = { + val clazz = Class.forName(className).asInstanceOf[Class[K]] + try { + if (descString.isEmpty) { + clazz.getDeclaredConstructor().newInstance().asInstanceOf[OperatorExecutor] + } else { + clazz + .getDeclaredConstructor(classOf[String]) + 
.newInstance(descString) + .asInstanceOf[OperatorExecutor] + } + } catch { + case e: NoSuchMethodException => + if (descString.isEmpty) { + clazz + .getDeclaredConstructor(classOf[Int], classOf[Int]) + .newInstance(idx, workerCount) + .asInstanceOf[OperatorExecutor] + } else { + clazz + .getDeclaredConstructor(classOf[String], classOf[Int], classOf[Int]) + .newInstance(descString, idx, workerCount) + .asInstanceOf[OperatorExecutor] + } + } + } +} diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/OpExecInitInfo.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/OpExecInitInfo.scala deleted file mode 100644 index 2e315e6296a..00000000000 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/executor/OpExecInitInfo.scala +++ /dev/null @@ -1,54 +0,0 @@ -package edu.uci.ics.amber.core.executor - -object OpExecInitInfo { - - type OpExecFunc = (Int, Int) => OperatorExecutor - type JavaOpExecFunc = - java.util.function.Function[(Int, Int), OperatorExecutor] with java.io.Serializable - - def generateJavaOpExec( - opExecInitInfo: OpExecInitInfo, - workerIdx: Int, - numWorkers: Int - ): OperatorExecutor = { - opExecInitInfo match { - case OpExecInitInfoWithCode(codeGen) => - val (code, _) = - codeGen(workerIdx, numWorkers) - JavaRuntimeCompilation - .compileCode(code) - .getDeclaredConstructor() - .newInstance() - .asInstanceOf[OperatorExecutor] - case OpExecInitInfoWithFunc(opGen) => - opGen( - workerIdx, - numWorkers - ) - } - } - - def apply(code: String, language: String): OpExecInitInfo = - OpExecInitInfoWithCode((_, _) => (code, language)) - def apply(opExecFunc: OpExecFunc): OpExecInitInfo = OpExecInitInfoWithFunc(opExecFunc) - def apply(opExecFunc: JavaOpExecFunc): OpExecInitInfo = - OpExecInitInfoWithFunc((idx, totalWorkerCount) => opExecFunc.apply(idx, totalWorkerCount)) -} - -/** - * Information regarding initializing an operator executor instance - * it could be two cases: - * - OpExecInitInfoWithFunc: - 
* A function to create an operator executor instance, with parameters: - * 1) the worker index, 2) the PhysicalOp; - * - OpExecInitInfoWithCode: - * A function returning the code string that to be compiled in a virtual machine. - */ -sealed trait OpExecInitInfo - -final case class OpExecInitInfoWithCode( - codeGen: (Int, Int) => (String, String) -) extends OpExecInitInfo -final case class OpExecInitInfoWithFunc( - opGen: (Int, Int) => OperatorExecutor -) extends OpExecInitInfo diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/FileResolver.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/FileResolver.scala index b2c2fafb0ba..c90707a77a8 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/FileResolver.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/storage/FileResolver.scala @@ -25,6 +25,9 @@ object FileResolver { * @return Either[String, DatasetFileDocument] - the resolved path as a String or a DatasetFileDocument */ def resolve(fileName: String): URI = { + if (isFileResolved(fileName)) { + return new URI(fileName) + } val resolvers: Seq[String => URI] = Seq(localResolveFunc, datasetResolveFunc) // Try each resolver function in sequence @@ -131,4 +134,19 @@ object FileResolver { throw new FileNotFoundException(s"Dataset file $fileName not found.") } } + + /** + * Checks if a given file path has a valid scheme. + * + * @param filePath The file path to check. + * @return `true` if the file path contains a valid scheme, `false` otherwise. 
+ */ + def isFileResolved(filePath: String): Boolean = { + try { + val uri = new URI(filePath) + uri.getScheme != null && uri.getScheme.nonEmpty + } catch { + case _: Exception => false // Invalid URI format + } + } } diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala index daf7cd679f9..d493f0891a5 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.core.workflow import com.fasterxml.jackson.annotation.{JsonIgnore, JsonIgnoreProperties} import com.typesafe.scalalogging.LazyLogging -import edu.uci.ics.amber.core.executor.{OpExecInitInfo, OpExecInitInfoWithCode} +import edu.uci.ics.amber.core.executor.{OpExecWithCode, OpExecInitInfo} import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.virtualidentity.{ ExecutionIdentity, @@ -10,7 +10,6 @@ import edu.uci.ics.amber.core.virtualidentity.{ PhysicalOpIdentity, WorkflowIdentity } -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalLink, PortIdentity} import org.jgrapht.graph.{DefaultEdge, DirectedAcyclicGraph} import org.jgrapht.traverse.TopologicalOrderIterator @@ -84,7 +83,7 @@ object PhysicalOp { executionId: ExecutionIdentity, opExecInitInfo: OpExecInitInfo ): PhysicalOp = - PhysicalOp(physicalOpId, workflowId, executionId, opExecInitInfo = opExecInitInfo) + PhysicalOp(physicalOpId, workflowId, executionId, opExecInitInfo) def manyToOnePhysicalOp( workflowId: WorkflowIdentity, @@ -138,31 +137,6 @@ object PhysicalOp { manyToOnePhysicalOp(physicalOpId, workflowId, executionId, opExecInitInfo) .withLocationPreference(Some(PreferController)) } - - def getExternalPortSchemas( - physicalOp: PhysicalOp, - fromInput: Boolean, - errorList: Option[ArrayBuffer[(OperatorIdentity, 
Throwable)]] - ): List[Option[Schema]] = { - - // Select either input ports or output ports and filter out the internal ports - val ports = if (fromInput) { - physicalOp.inputPorts.values.filterNot { case (port, _, _) => port.id.internal } - } else { - physicalOp.outputPorts.values.filterNot { case (port, _, _) => port.id.internal } - } - - ports.map { - case (_, _, schema) => - schema match { - case Left(err) => - errorList.foreach(errList => errList.append((physicalOp.id.logicalOpId, err))) - None - case Right(validSchema) => - Some(validSchema) - } - }.toList - } } // @JsonIgnore is not working when directly annotated to fields of a case class @@ -218,8 +192,6 @@ case class PhysicalOp( .toList .distinct - private lazy val isInitWithCode: Boolean = opExecInitInfo.isInstanceOf[OpExecInitInfoWithCode] - /** * Helper functions related to compile-time operations */ @@ -239,20 +211,12 @@ case class PhysicalOp( @JsonIgnore // this is needed to prevent the serialization issue def isPythonBased: Boolean = { opExecInitInfo match { - case opExecInfo: OpExecInitInfoWithCode => - val (_, language) = opExecInfo.codeGen(0, 0) + case OpExecWithCode(_, language) => language == "python" || language == "r-tuple" || language == "r-table" case _ => false } } - @JsonIgnore // this is needed to prevent the serialization issue - def getPythonCode: String = { - val (code, _) = - opExecInitInfo.asInstanceOf[OpExecInitInfoWithCode].codeGen(0, 0) - code - } - /** * creates a copy with the location preference information */ diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala index c5cc4fd152f..941db76f9d5 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala @@ -1,29 +1,27 @@ package 
edu.uci.ics.amber.operator -import edu.uci.ics.amber.core.executor.OpExecInitInfoWithCode -import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.core.executor.OpExecWithCode import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} trait PythonOperatorDescriptor extends LogicalOp { override def getPhysicalOp( workflowId: WorkflowIdentity, executionId: ExecutionIdentity ): PhysicalOp = { - val opExecInitInfo = OpExecInitInfoWithCode((_, _) => (generatePythonCode(), "python")) - val physicalOp = if (asSource()) { PhysicalOp.sourcePhysicalOp( workflowId, executionId, operatorIdentifier, - opExecInitInfo + OpExecWithCode(generatePythonCode(), "python") ) } else { PhysicalOp.oneToOnePhysicalOp( workflowId, executionId, operatorIdentifier, - opExecInitInfo + OpExecWithCode(generatePythonCode(), "python") ) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala index 96776d36b62..e60040eb467 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala @@ -1,12 +1,8 @@ package edu.uci.ics.amber.operator -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.{OpExecSink, OpExecSource} import edu.uci.ics.amber.core.storage.result.{OpResultStorage, ResultStorage} import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} -import edu.uci.ics.amber.operator.sink.ProgressiveUtils -import edu.uci.ics.amber.operator.sink.managed.ProgressiveSinkOpExec -import edu.uci.ics.amber.operator.source.cache.CacheSourceOpExec import 
edu.uci.ics.amber.core.virtualidentity.{ ExecutionIdentity, PhysicalOpIdentity, @@ -18,7 +14,8 @@ import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode.{ SET_SNAPSHOT, SINGLE_SNAPSHOT } -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} +import edu.uci.ics.amber.core.workflow._ +import edu.uci.ics.amber.operator.sink.ProgressiveUtils object SpecialPhysicalOpFactory { def newSinkPhysicalOp( @@ -33,13 +30,7 @@ object SpecialPhysicalOpFactory { PhysicalOpIdentity(opId, s"sink${portId.id}"), workflowIdentity, executionIdentity, - OpExecInitInfo((idx, workers) => - new ProgressiveSinkOpExec( - outputMode, - storageKey, - workflowIdentity - ) - ) + OpExecSink(storageKey, workflowIdentity, outputMode) ) .withInputPorts(List(InputPort(PortIdentity(internal = true)))) .withOutputPorts(List(OutputPort(PortIdentity(internal = true)))) @@ -90,7 +81,7 @@ object SpecialPhysicalOpFactory { PhysicalOpIdentity(opId, s"source${portId.id}"), workflowIdentity, executionIdentity, - OpExecInitInfo((_, _) => new CacheSourceOpExec(storageKey, workflowIdentity)) + OpExecSource(storageKey, workflowIdentity) ) .withInputPorts(List.empty) .withOutputPorts(List(outputPort)) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/TestOperators.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/TestOperators.scala index bf03e272577..7f428ab7967 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/TestOperators.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/TestOperators.scala @@ -67,7 +67,7 @@ object TestOperators { csvHeaderlessOp.fileName = Some(fileName) csvHeaderlessOp.customDelimiter = Some(",") csvHeaderlessOp.hasHeader = header - csvHeaderlessOp.setFileUri(FileResolver.resolve(fileName)) + csvHeaderlessOp.setResolvedFileName(FileResolver.resolve(fileName)) csvHeaderlessOp } @@ -76,7 +76,7 @@ object TestOperators { val jsonlOp = new JSONLScanSourceOpDesc 
jsonlOp.fileName = Some(fileName) jsonlOp.flatten = flatten - jsonlOp.setFileUri(FileResolver.resolve(fileName)) + jsonlOp.setResolvedFileName(FileResolver.resolve(fileName)) jsonlOp } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala index b27c9cac387..14c138562f4 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala @@ -2,23 +2,18 @@ package edu.uci.ics.amber.operator.aggregate import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.{ - HashPartition, - PhysicalOp, - PhysicalPlan, - SchemaPropagationFunc -} -import edu.uci.ics.amber.operator.LogicalOp -import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeNameList -import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.core.virtualidentity.{ ExecutionIdentity, PhysicalOpIdentity, WorkflowIdentity } -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalLink, PortIdentity} +import edu.uci.ics.amber.core.workflow._ +import edu.uci.ics.amber.operator.LogicalOp +import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeNameList +import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import javax.validation.constraints.{NotNull, Size} @@ -42,49 +37,52 @@ class AggregateOpDesc extends LogicalOp { // TODO: this is supposed to be blocking but due 
to limitations of materialization naming on the logical operator // we are keeping it not annotated as blocking. + val inputPort = InputPort(PortIdentity()) val outputPort = OutputPort(PortIdentity(internal = true)) - val partialPhysicalOp = - PhysicalOp - .oneToOnePhysicalOp( - PhysicalOpIdentity(operatorIdentifier, "localAgg"), - workflowId, - executionId, - OpExecInitInfo((_, _) => new AggregateOpExec(aggregations, groupByKeys)) - ) - .withIsOneToManyOp(true) - .withInputPorts(List(InputPort(PortIdentity()))) - .withOutputPorts(List(outputPort)) - .withPropagateSchema( - SchemaPropagationFunc(inputSchemas => - Map( - PortIdentity(internal = true) -> getOutputSchema( - operatorInfo.inputPorts.map(port => inputSchemas(port.id)).toArray - ) + val partialDesc = objectMapper.writeValueAsString(this) + val localAggregations = List(aggregations: _*) + val partialPhysicalOp = PhysicalOp + .oneToOnePhysicalOp( + PhysicalOpIdentity(operatorIdentifier, "localAgg"), + workflowId, + executionId, + OpExecWithClassName("edu.uci.ics.amber.operator.aggregate.AggregateOpExec", partialDesc) + ) + .withIsOneToManyOp(true) + .withInputPorts(List(inputPort)) + .withOutputPorts(List(outputPort)) + .withPropagateSchema( + SchemaPropagationFunc(inputSchemas => { + aggregations = localAggregations + Map( + PortIdentity(internal = true) -> getOutputSchema( + operatorInfo.inputPorts.map(port => inputSchemas(port.id)).toArray ) ) - ) - - val inputPort = InputPort(PortIdentity(0, internal = true)) + }) + ) + val finalInputPort = InputPort(PortIdentity(0, internal = true)) val finalOutputPort = OutputPort(PortIdentity(0), blocking = true) + // change aggregations to final + aggregations = aggregations.map(aggr => aggr.getFinal) + val finalDesc = objectMapper.writeValueAsString(this) val finalPhysicalOp = PhysicalOp .oneToOnePhysicalOp( PhysicalOpIdentity(operatorIdentifier, "globalAgg"), workflowId, executionId, - OpExecInitInfo((_, _) => - new AggregateOpExec(aggregations.map(aggr => 
aggr.getFinal), groupByKeys) - ) + OpExecWithClassName("edu.uci.ics.amber.operator.aggregate.AggregateOpExec", finalDesc) ) .withParallelizable(false) .withIsOneToManyOp(true) - .withInputPorts(List(inputPort)) + .withInputPorts(List(finalInputPort)) .withOutputPorts(List(finalOutputPort)) .withPropagateSchema( SchemaPropagationFunc(inputSchemas => Map(operatorInfo.outputPorts.head.id -> { - inputSchemas(PortIdentity(internal = true)) + inputSchemas(finalInputPort.id) }) ) ) @@ -94,7 +92,7 @@ class AggregateOpDesc extends LogicalOp { var plan = PhysicalPlan( operators = Set(partialPhysicalOp, finalPhysicalOp), links = Set( - PhysicalLink(partialPhysicalOp.id, outputPort.id, finalPhysicalOp.id, inputPort.id) + PhysicalLink(partialPhysicalOp.id, outputPort.id, finalPhysicalOp.id, finalInputPort.id) ) ) plan.operators.foreach(op => plan = plan.setOperator(op.withIsOneToManyOp(true))) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpExec.scala index d3b3d82022e..147803c6fca 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpExec.scala @@ -2,20 +2,15 @@ package edu.uci.ics.amber.operator.aggregate import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import scala.collection.mutable /** * AggregateOpExec performs aggregation operations on input tuples, optionally grouping them by specified keys. 
- * - * @param aggregations a list of aggregation operations to apply on the tuples - * @param groupByKeys a list of attribute names to group the tuples by */ -class AggregateOpExec( - aggregations: List[AggregationOperation], - groupByKeys: List[String] -) extends OperatorExecutor { - +class AggregateOpExec(descString: String) extends OperatorExecutor { + private val desc: AggregateOpDesc = objectMapper.readValue(descString, classOf[AggregateOpDesc]) private val keyedPartialAggregates = new mutable.HashMap[List[Object], List[Object]]() private var distributedAggregations: List[DistributedAggregation[Object]] = _ @@ -23,12 +18,13 @@ class AggregateOpExec( // Initialize distributedAggregations if it's not yet initialized if (distributedAggregations == null) { - distributedAggregations = - aggregations.map(agg => agg.getAggFunc(tuple.getSchema.getAttribute(agg.attribute).getType)) + distributedAggregations = desc.aggregations.map(agg => + agg.getAggFunc(tuple.getSchema.getAttribute(agg.attribute).getType) + ) } // Construct the group key - val key = groupByKeys.map(tuple.getField[Object]) + val key = desc.groupByKeys.map(tuple.getField[Object]) // Get or initialize the partial aggregate for the key val partialAggregates = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala index 2be0c18e598..7e71d29b42b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala @@ -1,12 +1,11 @@ package edu.uci.ics.amber.operator.cartesianProduct -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{Attribute, Schema} -import 
edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow._ import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} class CartesianProductOpDesc extends LogicalOp { override def getPhysicalOp( @@ -18,7 +17,7 @@ class CartesianProductOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new CartesianProductOpExec()) + OpExecWithClassName("edu.uci.ics.amber.operator.cartesianProduct.CartesianProductOpExec") ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala index 6921cccfba7..4a2cb463355 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala @@ -2,12 +2,13 @@ package edu.uci.ics.amber.operator.dictionary import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.google.common.base.Preconditions -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.map.MapOpDesc import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import 
edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @@ -39,7 +40,10 @@ class DictionaryMatcherOpDesc extends MapOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new DictionaryMatcherOpExec(attribute, dictionary, matchingType)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.dictionary.DictionaryMatcherOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExec.scala index 8233e8087a7..7a811537d2d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExec.scala @@ -2,19 +2,20 @@ package edu.uci.ics.amber.operator.dictionary import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} import edu.uci.ics.amber.operator.map.MapOpExec +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.apache.lucene.analysis.Analyzer import org.apache.lucene.analysis.en.EnglishAnalyzer import org.apache.lucene.analysis.tokenattributes.CharTermAttribute + import java.io.StringReader import scala.collection.mutable import scala.collection.mutable.ListBuffer class DictionaryMatcherOpExec( - attributeName: String, - dictionary: String, - matchingType: MatchingType + descString: String ) extends MapOpExec { - + private val desc: DictionaryMatcherOpDesc = + objectMapper.readValue(descString, classOf[DictionaryMatcherOpDesc]) // this is needed for the matching types Phrase 
and Conjunction var tokenizedDictionaryEntries: ListBuffer[mutable.Set[String]] = _ // this is needed for the simple Scan matching type @@ -40,8 +41,8 @@ class DictionaryMatcherOpExec( */ override def open(): Unit = { // create the dictionary by splitting the values first - dictionaryEntries = dictionary.split(",").toList.map(_.toLowerCase) - if (matchingType == MatchingType.CONJUNCTION_INDEXBASED) { + dictionaryEntries = desc.dictionary.split(",").toList.map(_.toLowerCase) + if (desc.matchingType == MatchingType.CONJUNCTION_INDEXBASED) { // then tokenize each entry this.luceneAnalyzer = new EnglishAnalyzer tokenizedDictionaryEntries = ListBuffer[mutable.Set[String]]() @@ -72,12 +73,12 @@ class DictionaryMatcherOpExec( * @return true if the tuple matches a dictionary entry according to the matching criteria; false otherwise. */ private def isTupleInDictionary(tuple: Tuple): Boolean = { - val text = tuple.getField(attributeName).asInstanceOf[String].toLowerCase + val text = tuple.getField(desc.attribute).asInstanceOf[String].toLowerCase // Return false if the text is empty, as it cannot match any dictionary entry if (text.isEmpty) return false - matchingType match { + desc.matchingType match { case MatchingType.SCANBASED => // Directly check if the dictionary contains the text dictionaryEntries.contains(text) @@ -130,7 +131,7 @@ class DictionaryMatcherOpExec( */ private def labelTupleIfMatched(tuple: Tuple): TupleLike = { val isMatched = - Option(tuple.getField[Any](attributeName)).exists(_ => isTupleInDictionary(tuple)) + Option(tuple.getField[Any](desc.attribute)).exists(_ => isTupleInDictionary(tuple)) TupleLike(tuple.getFields ++ Seq(isMatched)) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala index a8c25ad2363..8c144b3756a 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.operator.difference import com.google.common.base.Preconditions -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp} import edu.uci.ics.amber.operator.LogicalOp @@ -20,7 +20,7 @@ class DifferenceOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new DifferenceOpExec()) + OpExecWithClassName("edu.uci.ics.amber.operator.difference.DifferenceOpExec") ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala index ae00eb38c10..30c2f9f4b27 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.operator.distinct import com.google.common.base.Preconditions -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp} import edu.uci.ics.amber.operator.LogicalOp @@ -20,7 +20,7 @@ class DistinctOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new DistinctOpExec()) + OpExecWithClassName("edu.uci.ics.amber.operator.distinct.DistinctOpExec") ) .withInputPorts(operatorInfo.inputPorts) 
.withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala index 61b87009377..340e2ee8b48 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpDesc.scala @@ -1,9 +1,10 @@ package edu.uci.ics.amber.operator.filter import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @@ -22,7 +23,10 @@ class SpecializedFilterOpDesc extends FilterOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new SpecializedFilterOpExec(predicates)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.filter.SpecializedFilterOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.scala index 096721decac..88b83282153 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExec.scala @@ -1,8 +1,10 @@ 
package edu.uci.ics.amber.operator.filter import edu.uci.ics.amber.core.tuple.Tuple +import edu.uci.ics.amber.util.JSONUtils.objectMapper -class SpecializedFilterOpExec(predicates: List[FilterPredicate]) extends FilterOpExec { - - setFilterFunc((tuple: Tuple) => predicates.exists(_.evaluate(tuple))) +class SpecializedFilterOpExec(descString: String) extends FilterOpExec { + private val desc: SpecializedFilterOpDesc = + objectMapper.readValue(descString, classOf[SpecializedFilterOpDesc]) + setFilterFunc((tuple: Tuple) => desc.predicates.exists(_.evaluate(tuple))) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinBuildOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinBuildOpExec.scala index 08633de0c62..ced8d06de0f 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinBuildOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinBuildOpExec.scala @@ -2,17 +2,19 @@ package edu.uci.ics.amber.operator.hashJoin import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import scala.collection.mutable import scala.collection.mutable.ListBuffer -class HashJoinBuildOpExec[K](buildAttributeName: String) extends OperatorExecutor { - +class HashJoinBuildOpExec[K](descString: String) extends OperatorExecutor { + private val desc: HashJoinOpDesc[K] = + objectMapper.readValue(descString, classOf[HashJoinOpDesc[K]]) var buildTableHashMap: mutable.HashMap[K, ListBuffer[Tuple]] = _ override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] = { - val key = tuple.getField(buildAttributeName).asInstanceOf[K] + val key = tuple.getField(desc.buildAttributeName).asInstanceOf[K] buildTableHashMap.getOrElseUpdate(key, new ListBuffer[Tuple]()) += tuple Iterator() } diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala index fe009a91989..3777b2b6216 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.hashJoin import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow._ import edu.uci.ics.amber.operator.LogicalOp @@ -18,6 +18,7 @@ import edu.uci.ics.amber.operator.metadata.annotations.{ AutofillAttributeNameOnPort1 } import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.util.JSONUtils.objectMapper object HashJoinOpDesc { val HASH_JOIN_INTERNAL_KEY_NAME = "__internal__hashtable__key__" @@ -67,7 +68,10 @@ class HashJoinOpDesc[K] extends LogicalOp { PhysicalOpIdentity(operatorIdentifier, "build"), workflowId, executionId, - OpExecInitInfo((_, _) => new HashJoinBuildOpExec[K](buildAttributeName)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.hashJoin.HashJoinBuildOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(List(buildInputPort)) .withOutputPorts(List(buildOutputPort)) @@ -96,11 +100,9 @@ class HashJoinOpDesc[K] extends LogicalOp { PhysicalOpIdentity(operatorIdentifier, "probe"), workflowId, executionId, - OpExecInitInfo((_, _) => - new HashJoinProbeOpExec[K]( - probeAttributeName, - joinType - ) + OpExecWithClassName( + "edu.uci.ics.amber.operator.hashJoin.HashJoinProbeOpExec", 
+ objectMapper.writeValueAsString(this) ) ) .withInputPorts( diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinProbeOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinProbeOpExec.scala index 38d46367a4f..4483c09dc25 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinProbeOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinProbeOpExec.scala @@ -3,6 +3,7 @@ package edu.uci.ics.amber.operator.hashJoin import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} import edu.uci.ics.amber.operator.hashJoin.HashJoinOpDesc.HASH_JOIN_INTERNAL_KEY_NAME +import edu.uci.ics.amber.util.JSONUtils.objectMapper import scala.collection.mutable import scala.collection.mutable.ListBuffer @@ -41,11 +42,11 @@ object JoinUtils { } class HashJoinProbeOpExec[K]( - probeAttributeName: String, - joinType: JoinType + descString: String ) extends OperatorExecutor { - var currentTuple: Tuple = _ + private val desc: HashJoinOpDesc[K] = + objectMapper.readValue(descString, classOf[HashJoinOpDesc[K]]) var buildTableHashMap: mutable.HashMap[K, (ListBuffer[Tuple], Boolean)] = _ override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] = @@ -59,7 +60,7 @@ class HashJoinProbeOpExec[K]( Iterator.empty } else { // Probe phase - val key = tuple.getField(probeAttributeName).asInstanceOf[K] + val key = tuple.getField(desc.probeAttributeName).asInstanceOf[K] val (matchedTuples, joined) = buildTableHashMap.getOrElse(key, (new ListBuffer[Tuple](), false)) @@ -67,7 +68,7 @@ class HashJoinProbeOpExec[K]( // Join match found buildTableHashMap.put(key, (matchedTuples, true)) performJoin(tuple, matchedTuples) - } else if (joinType == JoinType.RIGHT_OUTER || joinType == JoinType.FULL_OUTER) { + } else if (desc.joinType == JoinType.RIGHT_OUTER || desc.joinType == 
JoinType.FULL_OUTER) { // Handle right and full outer joins without a match performRightAntiJoin(tuple) } else { @@ -77,7 +78,9 @@ class HashJoinProbeOpExec[K]( } override def onFinish(port: Int): Iterator[TupleLike] = { - if (port == 1 && (joinType == JoinType.LEFT_OUTER || joinType == JoinType.FULL_OUTER)) { + if ( + port == 1 && (desc.joinType == JoinType.LEFT_OUTER || desc.joinType == JoinType.FULL_OUTER) + ) { // Handle left and full outer joins after input is exhausted performLeftAntiJoin } else { @@ -104,7 +107,11 @@ class HashJoinProbeOpExec[K]( matchedTuples: ListBuffer[Tuple] ): Iterator[TupleLike] = { matchedTuples.iterator.map { buildTuple => - JoinUtils.joinTuples(buildTuple, probeTuple, skipAttributeName = Some(probeAttributeName)) + JoinUtils.joinTuples( + buildTuple, + probeTuple, + skipAttributeName = Some(desc.probeAttributeName) + ) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala index 8fc2e999ee7..1de8534ac11 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.operator.intersect import com.google.common.base.Preconditions -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp} import edu.uci.ics.amber.operator.LogicalOp @@ -20,7 +20,7 @@ class IntersectOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new IntersectOpExec()) + OpExecWithClassName("edu.uci.ics.amber.operator.intersect.IntersectOpExec") ) .withInputPorts(operatorInfo.inputPorts) 
.withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala index d27792c044d..985b8f9c4d6 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala @@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.intervalJoin import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.fasterxml.jackson.databind.annotation.JsonDeserialize import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{Attribute, Schema} import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.LogicalOp @@ -12,6 +12,7 @@ import edu.uci.ics.amber.operator.metadata.annotations.{ AutofillAttributeName, AutofillAttributeNameOnPort1 } +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} @@ -82,15 +83,9 @@ class IntervalJoinOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => - new IntervalJoinOpExec( - leftAttributeName, - rightAttributeName, - includeLeftBound, - includeRightBound, - constant, - timeIntervalType - ) + OpExecWithClassName( + "edu.uci.ics.amber.operator.intervalJoin.IntervalJoinOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpExec.scala index 2999ebc482a..3abc4fce00b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpExec.scala @@ -4,6 +4,7 @@ import edu.uci.ics.amber.core.WorkflowRuntimeException import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.{AttributeType, Tuple, TupleLike} import edu.uci.ics.amber.operator.hashJoin.JoinUtils +import edu.uci.ics.amber.util.JSONUtils.objectMapper import java.sql.Timestamp import scala.collection.mutable.ListBuffer @@ -13,14 +14,10 @@ import scala.collection.mutable.ListBuffer * 2. The left input join key takes as points, join condition is: left key in the range of (right key, right key + constant) */ class IntervalJoinOpExec( - leftAttributeName: String, - rightAttributeName: String, - includeLeftBound: Boolean, - includeRightBound: Boolean, - constant: Long, - timeIntervalType: Option[TimeIntervalType] + descString: String ) extends OperatorExecutor { - + private val desc: IntervalJoinOpDesc = + objectMapper.readValue(descString, classOf[IntervalJoinOpDesc]) var leftTable: ListBuffer[Tuple] = new ListBuffer[Tuple]() var rightTable: ListBuffer[Tuple] = new ListBuffer[Tuple]() @@ -34,10 +31,10 @@ class IntervalJoinOpExec( .filter(rightTableTuple => { intervalCompare( - tuple.getField(leftAttributeName), - rightTableTuple.getField(rightAttributeName), + tuple.getField(desc.leftAttributeName), + rightTableTuple.getField(desc.rightAttributeName), rightTableTuple.getSchema - .getAttribute(rightAttributeName) + .getAttribute(desc.rightAttributeName) .getType ) == 0 }) @@ -53,10 +50,10 @@ class IntervalJoinOpExec( leftTable .filter(leftTableTuple => { 
intervalCompare( - leftTableTuple.getField(leftAttributeName), - tuple.getField(rightAttributeName), + leftTableTuple.getField(desc.leftAttributeName), + tuple.getField(desc.rightAttributeName), leftTableTuple.getSchema - .getAttribute(leftAttributeName) + .getAttribute(desc.leftAttributeName) .getType ) == 0 }) @@ -74,10 +71,10 @@ class IntervalJoinOpExec( while (rightTable.nonEmpty) { if ( intervalCompare( - leftTableSmallestTuple.getField(leftAttributeName), - rightTable.head.getField(rightAttributeName), + leftTableSmallestTuple.getField(desc.leftAttributeName), + rightTable.head.getField(desc.rightAttributeName), leftTableSmallestTuple.getSchema - .getAttribute(leftAttributeName) + .getAttribute(desc.leftAttributeName) .getType ) > 0 ) { @@ -94,10 +91,10 @@ class IntervalJoinOpExec( while (leftTable.nonEmpty) { if ( intervalCompare( - leftTable.head.getField(leftAttributeName), - rightTableSmallestTuple.getField(rightAttributeName), + leftTable.head.getField(desc.leftAttributeName), + rightTableSmallestTuple.getField(desc.rightAttributeName), rightTableSmallestTuple.getSchema - .getAttribute(rightAttributeName) + .getAttribute(desc.rightAttributeName) .getType ) < 0 ) { @@ -114,15 +111,15 @@ class IntervalJoinOpExec( leftBoundValue: T, rightBoundValue: T )(implicit ev$1: T => Ordered[T]): Int = { - if (includeLeftBound && includeRightBound) { + if (desc.includeLeftBound && desc.includeRightBound) { if (pointValue >= leftBoundValue && pointValue <= rightBoundValue) 0 else if (pointValue < leftBoundValue) -1 else 1 - } else if (includeLeftBound && !includeRightBound) { + } else if (desc.includeLeftBound && !desc.includeRightBound) { if (pointValue >= leftBoundValue && pointValue < rightBoundValue) 0 else if (pointValue < leftBoundValue) -1 else 1 - } else if (!includeLeftBound && includeRightBound) { + } else if (!desc.includeLeftBound && desc.includeRightBound) { if (pointValue > leftBoundValue && pointValue <= rightBoundValue) 0 else if (pointValue <= 
leftBoundValue) -1 else 1 @@ -142,7 +139,7 @@ class IntervalJoinOpExec( if (dataType == AttributeType.LONG) { val pointValue: Long = point.asInstanceOf[Long] val leftBoundValue: Long = leftBound.asInstanceOf[Long] - val constantValue: Long = constant + val constantValue: Long = desc.constant val rightBoundValue: Long = leftBoundValue + constantValue result = processNumValue[Long]( pointValue, @@ -153,7 +150,7 @@ class IntervalJoinOpExec( } else if (dataType == AttributeType.DOUBLE) { val pointValue: Double = point.asInstanceOf[Double] val leftBoundValue: Double = leftBound.asInstanceOf[Double] - val constantValue: Double = constant.asInstanceOf[Double] + val constantValue: Double = desc.constant.asInstanceOf[Double] val rightBoundValue: Double = leftBoundValue + constantValue result = processNumValue[Double]( pointValue, @@ -163,7 +160,7 @@ class IntervalJoinOpExec( } else if (dataType == AttributeType.INTEGER) { val pointValue: Int = point.asInstanceOf[Int] val leftBoundValue: Int = leftBound.asInstanceOf[Int] - val constantValue: Int = constant.asInstanceOf[Int] + val constantValue: Int = desc.constant.asInstanceOf[Int] val rightBoundValue: Int = leftBoundValue + constantValue result = processNumValue[Int]( pointValue, @@ -174,21 +171,21 @@ class IntervalJoinOpExec( val pointValue: Timestamp = point.asInstanceOf[Timestamp] val leftBoundValue: Timestamp = leftBound.asInstanceOf[Timestamp] val rightBoundValue: Timestamp = - timeIntervalType match { + desc.timeIntervalType match { case Some(TimeIntervalType.YEAR) => - Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusYears(constant)) + Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusYears(desc.constant)) case Some(TimeIntervalType.MONTH) => - Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusMonths(constant)) + Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusMonths(desc.constant)) case Some(TimeIntervalType.DAY) => - Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusDays(constant)) + 
Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusDays(desc.constant)) case Some(TimeIntervalType.HOUR) => - Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusHours(constant)) + Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusHours(desc.constant)) case Some(TimeIntervalType.MINUTE) => - Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusMinutes(constant)) + Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusMinutes(desc.constant)) case Some(TimeIntervalType.SECOND) => - Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusSeconds(constant)) + Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusSeconds(desc.constant)) case None => - Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusDays(constant)) + Timestamp.valueOf(leftBoundValue.toLocalDateTime.plusDays(desc.constant)) } result = processNumValue( pointValue.getTime, diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpDesc.scala index cf478610d56..b3e41f267e6 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpDesc.scala @@ -2,11 +2,12 @@ package edu.uci.ics.amber.operator.keywordSearch import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.operator.filter.FilterOpDesc import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName +import edu.uci.ics.amber.util.JSONUtils.objectMapper import 
edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @@ -32,7 +33,10 @@ class KeywordSearchOpDesc extends FilterOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new KeywordSearchOpExec(attribute, keyword)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.keywordSearch.KeywordSearchOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExec.scala index 4275a682521..1154ed18a28 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExec.scala @@ -2,23 +2,27 @@ package edu.uci.ics.amber.operator.keywordSearch import edu.uci.ics.amber.core.tuple.Tuple import edu.uci.ics.amber.operator.filter.FilterOpExec +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.apache.lucene.analysis.standard.StandardAnalyzer import org.apache.lucene.index.memory.MemoryIndex import org.apache.lucene.queryparser.classic.QueryParser import org.apache.lucene.search.Query -class KeywordSearchOpExec(attributeName: String, keyword: String) extends FilterOpExec { +class KeywordSearchOpExec(descString: String) extends FilterOpExec { + private val desc: KeywordSearchOpDesc = + objectMapper.readValue(descString, classOf[KeywordSearchOpDesc]) + // We chose StandardAnalyzer because it provides more comprehensive tokenization, retaining numeric tokens and handling a broader range of characters. 
// This ensures that search functionality can include standalone numbers (e.g., "3") and complex queries while offering robust performance for most use cases. @transient private lazy val analyzer = new StandardAnalyzer() - @transient lazy val query: Query = new QueryParser(attributeName, analyzer).parse(keyword) + @transient lazy val query: Query = new QueryParser(desc.attribute, analyzer).parse(desc.keyword) @transient private lazy val memoryIndex: MemoryIndex = new MemoryIndex() this.setFilterFunc(findKeyword) private def findKeyword(tuple: Tuple): Boolean = { - Option[Any](tuple.getField(attributeName)).map(_.toString).exists { fieldValue => - memoryIndex.addField(attributeName, fieldValue, analyzer) + Option[Any](tuple.getField(desc.attribute)).map(_.toString).exists { fieldValue => + memoryIndex.addField(desc.attribute, fieldValue, analyzer) val isMatch = memoryIndex.search(query) > 0.0f memoryIndex.reset() isMatch diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala index b3cf15a0e40..70ebe4725f4 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala @@ -2,11 +2,12 @@ package edu.uci.ics.amber.operator.limit import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.operator.{LogicalOp, StateTransferFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import 
edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @@ -28,9 +29,10 @@ class LimitOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => { - new LimitOpExec(limit) - }) + OpExecWithClassName( + "edu.uci.ics.amber.operator.limit.LimitOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpExec.scala index b3d74a81f56..f396f0acbee 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpExec.scala @@ -2,13 +2,15 @@ package edu.uci.ics.amber.operator.limit import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} +import edu.uci.ics.amber.util.JSONUtils.objectMapper -class LimitOpExec(limit: Int) extends OperatorExecutor { +class LimitOpExec(descString: String) extends OperatorExecutor { + private val desc: LimitOpDesc = objectMapper.readValue(descString, classOf[LimitOpDesc]) var count = 0 override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] = { - if (count < limit) { + if (count < desc.limit) { count += 1 Iterator(tuple) } else { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala index 2bd6fc413fd..47b80cfaef0 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala +++ 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala @@ -3,12 +3,13 @@ package edu.uci.ics.amber.operator.projection import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.google.common.base.Preconditions import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{Attribute, Schema} import edu.uci.ics.amber.core.workflow.PhysicalOp.oneToOnePhysicalOp import edu.uci.ics.amber.core.workflow._ import edu.uci.ics.amber.operator.map.MapOpDesc import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @@ -29,7 +30,10 @@ class ProjectionOpDesc extends MapOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new ProjectionOpExec(attributes, isDrop)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.projection.ProjectionOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExec.scala index 75458594c8d..888c4b4c976 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExec.scala @@ -3,23 +3,24 @@ package edu.uci.ics.amber.operator.projection import com.google.common.base.Preconditions import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} import 
edu.uci.ics.amber.operator.map.MapOpExec +import edu.uci.ics.amber.util.JSONUtils.objectMapper import scala.collection.mutable class ProjectionOpExec( - attributeUnits: List[AttributeUnit], - dropOption: Boolean = false + descString: String ) extends MapOpExec { + val desc: ProjectionOpDesc = objectMapper.readValue(descString, classOf[ProjectionOpDesc]) setMapFunc(project) def project(tuple: Tuple): TupleLike = { - Preconditions.checkArgument(attributeUnits.nonEmpty) + Preconditions.checkArgument(desc.attributes.nonEmpty) var selectedUnits: List[AttributeUnit] = List() val fields = mutable.LinkedHashMap[String, Any]() - if (dropOption) { + if (desc.isDrop) { val allAttribute = tuple.schema.getAttributeNames - val selectedAttributes = attributeUnits.map(_.getOriginalAttribute) + val selectedAttributes = desc.attributes.map(_.getOriginalAttribute) val keepAttributes = allAttribute.diff(selectedAttributes) keepAttributes.foreach { attribute => @@ -31,7 +32,7 @@ class ProjectionOpExec( } else { - selectedUnits = attributeUnits + selectedUnits = desc.attributes } selectedUnits.foreach { attributeUnit => diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpDesc.scala index 7fa187e70ee..3fd4849c27c 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpDesc.scala @@ -1,14 +1,12 @@ package edu.uci.ics.amber.operator.randomksampling import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} -import edu.uci.ics.amber.core.executor.OpExecInitInfo -import edu.uci.ics.amber.core.workflow.PhysicalOp +import edu.uci.ics.amber.core.executor.OpExecWithClassName +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, 
WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalOp} import edu.uci.ics.amber.operator.filter.FilterOpDesc import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} - -import scala.util.Random +import edu.uci.ics.amber.util.JSONUtils.objectMapper class RandomKSamplingOpDesc extends FilterOpDesc { @@ -25,8 +23,9 @@ class RandomKSamplingOpDesc extends FilterOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((idx, workerCount) => - new RandomKSamplingOpExec(percentage, idx, Array.fill(workerCount)(Random.nextInt())) + OpExecWithClassName( + "edu.uci.ics.amber.operator.randomksampling.RandomKSamplingOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpExec.scala index f6028b6c5c5..74767f46d00 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/randomksampling/RandomKSamplingOpExec.scala @@ -1,11 +1,14 @@ package edu.uci.ics.amber.operator.randomksampling import edu.uci.ics.amber.operator.filter.FilterOpExec +import edu.uci.ics.amber.util.JSONUtils.objectMapper import scala.util.Random -class RandomKSamplingOpExec(percentage: Int, worker: Int, seedFunc: Int => Int) - extends FilterOpExec { - val rand: Random = new Random(seedFunc(worker)) - setFilterFunc(_ => (percentage / 100.0) >= rand.nextDouble()) +class RandomKSamplingOpExec(descString: String, idx: Int, workerCount: Int) extends FilterOpExec { + private val desc: RandomKSamplingOpDesc = + 
objectMapper.readValue(descString, classOf[RandomKSamplingOpDesc]) + + val rand: Random = new Random(workerCount) + setFilterFunc(_ => (desc.percentage / 100.0) >= rand.nextDouble()) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpDesc.scala index 6d06c839943..070417540fb 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpDesc.scala @@ -2,11 +2,12 @@ package edu.uci.ics.amber.operator.regex import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.operator.filter.FilterOpDesc import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @@ -35,7 +36,10 @@ class RegexOpDesc extends FilterOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new RegexOpExec(regex, caseInsensitive, attribute)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.regex.RegexOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpExec.scala index 
49a5aafbcfa..0c2b72a402b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/regex/RegexOpExec.scala @@ -2,17 +2,18 @@ package edu.uci.ics.amber.operator.regex import edu.uci.ics.amber.core.tuple.Tuple import edu.uci.ics.amber.operator.filter.FilterOpExec +import edu.uci.ics.amber.util.JSONUtils.objectMapper import java.util.regex.Pattern -class RegexOpExec(regex: String, caseInsensitive: Boolean, attributeName: String) - extends FilterOpExec { +class RegexOpExec(descString: String) extends FilterOpExec { + private val desc: RegexOpDesc = objectMapper.readValue(descString, classOf[RegexOpDesc]) lazy val pattern: Pattern = - Pattern.compile(regex, if (caseInsensitive) Pattern.CASE_INSENSITIVE else 0) + Pattern.compile(desc.regex, if (desc.caseInsensitive) Pattern.CASE_INSENSITIVE else 0) this.setFilterFunc(this.matchRegex) private def matchRegex(tuple: Tuple): Boolean = - Option[Any](tuple.getField(attributeName).toString) + Option[Any](tuple.getField(desc.attribute).toString) .map(_.toString) .exists(value => pattern.matcher(value).find) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala index 79ab7eadf1e..cc1840609bf 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala @@ -2,16 +2,14 @@ package edu.uci.ics.amber.operator.reservoirsampling import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.google.common.base.Preconditions -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import 
edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} -import edu.uci.ics.amber.operator.util.OperatorDescriptorUtils.equallyPartitionGoal - -import scala.util.Random +import edu.uci.ics.amber.util.JSONUtils.objectMapper class ReservoirSamplingOpDesc extends LogicalOp { @@ -28,12 +26,9 @@ class ReservoirSamplingOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((idx, workerCount) => - new ReservoirSamplingOpExec( - idx, - equallyPartitionGoal(k, workerCount), - Array.fill(workerCount)(Random.nextInt()) - ) + OpExecWithClassName( + "edu.uci.ics.amber.operator.reservoirsampling.ReservoirSamplingOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpExec.scala index 9e7f7c8cc14..7382e410dc9 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpExec.scala @@ -2,22 +2,27 @@ package edu.uci.ics.amber.operator.reservoirsampling import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} +import edu.uci.ics.amber.operator.util.OperatorDescriptorUtils.equallyPartitionGoal +import edu.uci.ics.amber.util.JSONUtils.objectMapper import scala.util.Random -class ReservoirSamplingOpExec(actor: 
Int, kPerActor: Int => Int, seedFunc: Int => Int) +class ReservoirSamplingOpExec(descString: String, idx: Int, workerCount: Int) extends OperatorExecutor { + private val desc: ReservoirSamplingOpDesc = + objectMapper.readValue(descString, classOf[ReservoirSamplingOpDesc]) + private val count: Int = equallyPartitionGoal(desc.k, workerCount)(idx) private var n: Int = 0 - private val reservoir: Array[Tuple] = Array.ofDim(kPerActor(actor)) - private val rand: Random = new Random(seedFunc(actor)) + private val reservoir: Array[Tuple] = Array.ofDim(count) + private val rand: Random = new Random(workerCount) override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] = { - if (n < kPerActor(actor)) { + if (n < count) { reservoir(n) = tuple } else { val i = rand.nextInt(n) - if (i < kPerActor(actor)) { + if (i < count) { reservoir(i) = tuple } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala index 247e7893f23..815b08bdb2a 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala @@ -2,12 +2,13 @@ package edu.uci.ics.amber.operator.sentiment import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaInject -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.map.MapOpDesc import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import 
edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @@ -46,7 +47,10 @@ class SentimentAnalysisOpDesc extends MapOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new SentimentAnalysisOpExec(attribute)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.sentiment.SentimentAnalysisOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpExec.java b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpExec.java index 038e12e461f..df907bae693 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpExec.java +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpExec.java @@ -1,6 +1,7 @@ package edu.uci.ics.amber.operator.sentiment; +import com.fasterxml.jackson.core.JsonProcessingException; import edu.stanford.nlp.ling.CoreAnnotations; import edu.stanford.nlp.neural.rnn.RNNCoreAnnotations; import edu.stanford.nlp.pipeline.Annotation; @@ -10,6 +11,7 @@ import edu.uci.ics.amber.core.tuple.Tuple; import edu.uci.ics.amber.core.tuple.TupleLike; import edu.uci.ics.amber.operator.map.MapOpExec; +import edu.uci.ics.amber.util.JSONUtils; import scala.Function1; import java.io.Serializable; @@ -19,8 +21,9 @@ public class SentimentAnalysisOpExec extends MapOpExec { private final String attributeName; private final StanfordCoreNLPWrapper coreNlp; - public SentimentAnalysisOpExec(String attributeName) { - this.attributeName = attributeName; + public SentimentAnalysisOpExec(String descString) throws 
JsonProcessingException { + SentimentAnalysisOpDesc desc = JSONUtils.objectMapper().readValue(descString, SentimentAnalysisOpDesc.class); + this.attributeName = desc.attribute(); Properties props = new Properties(); props.setProperty("annotators", "tokenize, ssplit, parse, sentiment"); coreNlp = new StanfordCoreNLPWrapper(props); diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/ProgressiveSinkOpExec.scala similarity index 89% rename from core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala rename to core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/ProgressiveSinkOpExec.scala index bd0bd187c23..9c9b2e7fad1 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/managed/ProgressiveSinkOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sink/ProgressiveSinkOpExec.scala @@ -1,10 +1,9 @@ -package edu.uci.ics.amber.operator.sink.managed +package edu.uci.ics.amber.operator.sink import edu.uci.ics.amber.core.executor.SinkOperatorExecutor import edu.uci.ics.amber.core.storage.model.BufferedItemWriter import edu.uci.ics.amber.core.storage.result.ResultStorage import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} -import edu.uci.ics.amber.operator.sink.ProgressiveUtils import edu.uci.ics.amber.core.virtualidentity.WorkflowIdentity import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode import edu.uci.ics.amber.core.workflow.PortIdentity @@ -15,7 +14,10 @@ class ProgressiveSinkOpExec( workflowIdentity: WorkflowIdentity ) extends SinkOperatorExecutor { val writer: BufferedItemWriter[Tuple] = - ResultStorage.getOpResultStorage(workflowIdentity).get(storageKey).writer() + ResultStorage + .getOpResultStorage(workflowIdentity) + .get(storageKey) + .writer() override def open(): Unit = { writer.open() diff 
--git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala index 73366908cb1..6c06d95dadc 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala @@ -3,12 +3,13 @@ package edu.uci.ics.amber.operator.sortPartitions import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.google.common.base.Preconditions import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{PhysicalOp, RangePartition} import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @@ -48,15 +49,9 @@ class SortPartitionsOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo(opExecFunc = - (idx, workerCount) => - new SortPartitionOpExec( - sortAttributeName, - idx, - domainMin, - domainMax, - workerCount - ) + OpExecWithClassName( + "edu.uci.ics.amber.operator.sortPartitions.SortPartitionsOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionOpExec.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExec.scala similarity index 77% rename from core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionOpExec.scala rename to core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExec.scala index fd52c34d0d0..df773dac8e4 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExec.scala @@ -2,17 +2,15 @@ package edu.uci.ics.amber.operator.sortPartitions import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.{AttributeType, Tuple, TupleLike} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import scala.collection.mutable.ArrayBuffer -class SortPartitionOpExec( - sortAttributeName: String, - localIdx: Int, - domainMin: Long, - domainMax: Long, - numberOfWorkers: Int +class SortPartitionsOpExec( + descString: String ) extends OperatorExecutor { - + private val desc: SortPartitionsOpDesc = + objectMapper.readValue(descString, classOf[SortPartitionsOpDesc]) private var unorderedTuples: ArrayBuffer[Tuple] = _ private def sortTuples(): Iterator[TupleLike] = unorderedTuples.sortWith(compareTuples).iterator @@ -25,8 +23,8 @@ class SortPartitionOpExec( override def onFinish(port: Int): Iterator[TupleLike] = sortTuples() private def compareTuples(t1: Tuple, t2: Tuple): Boolean = { - val attributeType = t1.getSchema.getAttribute(sortAttributeName).getType - val attributeIndex = t1.getSchema.getIndex(sortAttributeName) + val attributeType = t1.getSchema.getAttribute(desc.sortAttributeName).getType + val attributeIndex = t1.getSchema.getIndex(desc.sortAttributeName) attributeType match { case AttributeType.LONG => t1.getField[Long](attributeIndex) < t2.getField[Long](attributeIndex) diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/SourceOperatorDescriptor.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/SourceOperatorDescriptor.scala index 3d2d18eac50..ad36af84dbb 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/SourceOperatorDescriptor.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/SourceOperatorDescriptor.scala @@ -6,11 +6,10 @@ import edu.uci.ics.amber.operator.LogicalOp abstract class SourceOperatorDescriptor extends LogicalOp { + def sourceSchema(): Schema + override def getOutputSchema(schemas: Array[Schema]): Schema = { Preconditions.checkArgument(schemas.isEmpty) sourceSchema() } - - def sourceSchema(): Schema - } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/TwitterSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/TwitterSourceOpExec.scala index 8d9013d7f66..67eed71cd21 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/TwitterSourceOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/TwitterSourceOpExec.scala @@ -1,14 +1,15 @@ package edu.uci.ics.amber.operator.source.apis.twitter import edu.uci.ics.amber.core.executor.SourceOperatorExecutor +import edu.uci.ics.amber.util.JSONUtils.objectMapper import io.github.redouane59.twitter.TwitterClient import io.github.redouane59.twitter.signature.TwitterCredentials abstract class TwitterSourceOpExec( - apiKey: String, - apiSecretKey: String, - stopWhenRateLimited: Boolean + descString: String ) extends SourceOperatorExecutor { + private val desc: TwitterSourceOpDesc = + objectMapper.readValue(descString, classOf[TwitterSourceOpDesc]) // batch size for each API request defined by Twitter // 500 is the maximum tweets for each request val TWITTER_API_BATCH_SIZE_MAX 
= 500 @@ -28,11 +29,11 @@ abstract class TwitterSourceOpExec( twitterClient = new TwitterClient( TwitterCredentials .builder() - .apiKey(apiKey) - .apiSecretKey(apiSecretKey) + .apiKey(desc.apiKey) + .apiSecretKey(desc.apiSecretKey) .build() ) - twitterClient.setAutomaticRetry(!stopWhenRateLimited) + twitterClient.setAutomaticRetry(!desc.stopWhenRateLimited) } override def close(): Unit = {} diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala index 5b17a08d23a..c3a92cbcadd 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala @@ -6,11 +6,12 @@ import com.kjetland.jackson.jsonSchema.annotations.{ JsonSchemaInject, JsonSchemaTitle } -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.annotations.UIWidget import edu.uci.ics.amber.operator.source.apis.twitter.TwitterSourceOpDesc +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} class TwitterFullArchiveSearchSourceOpDesc extends TwitterSourceOpDesc { @@ -49,17 +50,9 @@ class TwitterFullArchiveSearchSourceOpDesc extends TwitterSourceOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => - new TwitterFullArchiveSearchSourceOpExec( - apiKey, - apiSecretKey, - stopWhenRateLimited, - searchQuery, - limit, - fromDateTime, - 
toDateTime, - () => sourceSchema() - ) + OpExecWithClassName( + "edu.uci.ics.amber.operator.source.apis.twitter.v2.TwitterFullArchiveSearchSourceOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpExec.scala index 023a3f5337f..af9fd0f2e3b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpExec.scala @@ -3,6 +3,7 @@ package edu.uci.ics.amber.operator.source.apis.twitter.v2 import edu.uci.ics.amber.core.tuple.{Schema, Tuple, TupleLike} import edu.uci.ics.amber.operator.source.apis.twitter.TwitterSourceOpExec import edu.uci.ics.amber.operator.source.apis.twitter.v2.TwitterUtils.tweetDataToTuple +import edu.uci.ics.amber.util.JSONUtils.objectMapper import io.github.redouane59.twitter.dto.endpoints.AdditionalParameters import io.github.redouane59.twitter.dto.tweet.TweetList import io.github.redouane59.twitter.dto.tweet.TweetV2.TweetData @@ -15,18 +16,11 @@ import scala.collection.{Iterator, mutable} import scala.jdk.CollectionConverters.ListHasAsScala class TwitterFullArchiveSearchSourceOpExec( - apiKey: String, - apiSecretKey: String, - stopWhenRateLimited: Boolean, - searchQuery: String, - limit: Int, - fromDateTime: String, - toDateTime: String, - schemaFunc: () => Schema -) extends TwitterSourceOpExec(apiKey, apiSecretKey, stopWhenRateLimited) { - val outputSchema: Schema = schemaFunc() - - var curLimit: Int = limit + descString: String +) extends TwitterSourceOpExec(descString) { + private val desc: TwitterFullArchiveSearchSourceOpDesc = + 
objectMapper.readValue(descString, classOf[TwitterFullArchiveSearchSourceOpDesc]) + var curLimit: Int = desc.limit // nextToken is used to retrieve next page of results, if exists. var nextToken: String = _ // contains tweets from the previous request. @@ -34,6 +28,7 @@ class TwitterFullArchiveSearchSourceOpExec( var userCache: Map[String, UserData] = Map() var hasNextRequest: Boolean = curLimit > 0 var lastQueryTime: Long = 0 + val schema: Schema = desc.sourceSchema() override def produceTuple(): Iterator[TupleLike] = new Iterator[TupleLike]() { @@ -43,9 +38,9 @@ class TwitterFullArchiveSearchSourceOpExec( // if the current cache is exhausted, query for the next response if (tweetCache.isEmpty && hasNextRequest) { queryForNextBatch( - searchQuery, - LocalDateTime.parse(fromDateTime, DateTimeFormatter.ISO_DATE_TIME), - LocalDateTime.parse(toDateTime, DateTimeFormatter.ISO_DATE_TIME), + desc.searchQuery, + LocalDateTime.parse(desc.fromDateTime, DateTimeFormatter.ISO_DATE_TIME), + LocalDateTime.parse(desc.toDateTime, DateTimeFormatter.ISO_DATE_TIME), curLimit.min(TWITTER_API_BATCH_SIZE_MAX) ) } @@ -65,7 +60,7 @@ class TwitterFullArchiveSearchSourceOpExec( val user = userCache.get(tweet.getAuthorId) - tweetDataToTuple(tweet, user, outputSchema) + tweetDataToTuple(tweet, user, schema) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala index 39d5bd697bb..15b0ddfaf21 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala @@ -6,11 +6,12 @@ import com.kjetland.jackson.jsonSchema.annotations.{ JsonSchemaInject, JsonSchemaTitle } -import 
edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.annotations.UIWidget import edu.uci.ics.amber.operator.source.apis.twitter.TwitterSourceOpDesc +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} class TwitterSearchSourceOpDesc extends TwitterSourceOpDesc { @@ -39,15 +40,9 @@ class TwitterSearchSourceOpDesc extends TwitterSourceOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => - new TwitterSearchSourceOpExec( - apiKey, - apiSecretKey, - stopWhenRateLimited, - searchQuery, - limit, - () => sourceSchema() - ) + OpExecWithClassName( + "edu.uci.ics.amber.operator.source.apis.twitter.v2.TwitterSearchSourceOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpExec.scala index 27522c99103..198c22e184b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpExec.scala @@ -3,6 +3,7 @@ package edu.uci.ics.amber.operator.source.apis.twitter.v2 import edu.uci.ics.amber.core.tuple.{Schema, Tuple, TupleLike} import edu.uci.ics.amber.operator.source.apis.twitter.TwitterSourceOpExec import edu.uci.ics.amber.operator.source.apis.twitter.v2.TwitterUtils.tweetDataToTuple +import edu.uci.ics.amber.util.JSONUtils.objectMapper import 
io.github.redouane59.twitter.dto.endpoints.AdditionalParameters import io.github.redouane59.twitter.dto.tweet.TweetList import io.github.redouane59.twitter.dto.tweet.TweetV2.TweetData @@ -13,15 +14,11 @@ import scala.collection.{Iterator, mutable} import scala.jdk.CollectionConverters.ListHasAsScala class TwitterSearchSourceOpExec( - apiKey: String, - apiSecretKey: String, - stopWhenRateLimited: Boolean, - searchQuery: String, - limit: Int, - schemaFunc: () => Schema -) extends TwitterSourceOpExec(apiKey, apiSecretKey, stopWhenRateLimited) { - val outputSchema: Schema = schemaFunc() - var curLimit: Int = limit + descString: String +) extends TwitterSourceOpExec(descString) { + private val desc: TwitterSearchSourceOpDesc = + objectMapper.readValue(descString, classOf[TwitterSearchSourceOpDesc]) + var curLimit: Int = desc.limit // nextToken is used to retrieve next page of results, if exists. var nextToken: String = _ // contains tweets from the previous request. @@ -29,6 +26,7 @@ class TwitterSearchSourceOpExec( var userCache: Map[String, UserData] = Map() var hasNextRequest: Boolean = curLimit > 0 var lastQueryTime: Long = 0 + val schema: Schema = desc.sourceSchema() override def produceTuple(): Iterator[TupleLike] = new Iterator[TupleLike]() { @@ -38,7 +36,7 @@ class TwitterSearchSourceOpExec( // if the current cache is exhausted, query for the next response if (tweetCache.isEmpty && hasNextRequest) { queryForNextBatch( - searchQuery, + desc.searchQuery, curLimit.min(TWITTER_API_BATCH_SIZE_MAX) ) } @@ -58,7 +56,7 @@ class TwitterSearchSourceOpExec( val user = userCache.get(tweet.getAuthorId) - tweetDataToTuple(tweet, user, outputSchema) + tweetDataToTuple(tweet, user, schema) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala index 38ce1e997e0..49f5028d718 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala @@ -2,11 +2,12 @@ package edu.uci.ics.amber.operator.source.fetcher import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.OutputPort @@ -26,7 +27,7 @@ class URLFetcherOpDesc extends SourceOperatorDescriptor { ) var decodingMethod: DecodingMethod = _ - def sourceSchema(): Schema = { + override def sourceSchema(): Schema = { Schema .builder() .add( @@ -49,7 +50,10 @@ class URLFetcherOpDesc extends SourceOperatorDescriptor { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new URLFetcherOpExec(url, decodingMethod)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.source.fetcher.URLFetcherOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpExec.scala index 8c61c9cabad..5c519f45aec 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpExec.scala @@ -3,21 +3,22 @@ package edu.uci.ics.amber.operator.source.fetcher import edu.uci.ics.amber.core.executor.SourceOperatorExecutor import edu.uci.ics.amber.core.tuple.TupleLike import edu.uci.ics.amber.operator.source.fetcher.URLFetchUtil.getInputStreamFromURL +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.apache.commons.io.IOUtils import java.net.URL -class URLFetcherOpExec(url: String, decodingMethod: DecodingMethod) extends SourceOperatorExecutor { - +class URLFetcherOpExec(descString: String) extends SourceOperatorExecutor { + private val desc: URLFetcherOpDesc = objectMapper.readValue(descString, classOf[URLFetcherOpDesc]) override def produceTuple(): Iterator[TupleLike] = { - val urlObj = new URL(url) + val urlObj = new URL(desc.url) val input = getInputStreamFromURL(urlObj) val contentInputStream = input match { case Some(value) => value - case None => IOUtils.toInputStream(s"Fetch failed for URL: $url", "UTF-8") + case None => IOUtils.toInputStream(s"Fetch failed for URL: ${desc.url}", "UTF-8") } - Iterator(if (decodingMethod == DecodingMethod.UTF_8) { + Iterator(if (desc.decodingMethod == DecodingMethod.UTF_8) { TupleLike(IOUtils.toString(contentInputStream, "UTF-8")) } else { TupleLike(IOUtils.toByteArray(contentInputStream)) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala index 5902e0e030c..90c65c87eb9 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala @@ -6,12 +6,13 @@ import
com.kjetland.jackson.jsonSchema.annotations.{ JsonSchemaString, JsonSchemaTitle } -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.annotations.HideAnnotation import edu.uci.ics.amber.operator.source.scan.text.TextSourceOpDesc -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.util.JSONUtils.objectMapper @JsonIgnoreProperties(value = Array("limit", "offset", "fileEncoding")) class FileScanSourceOpDesc extends ScanSourceOpDesc with TextSourceOpDesc { @@ -52,26 +53,19 @@ class FileScanSourceOpDesc extends ScanSourceOpDesc with TextSourceOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => - new FileScanSourceOpExec( - fileUri.get, - attributeType, - encoding, - extract, - outputFileName, - fileScanLimit, - fileScanOffset - ) + OpExecWithClassName( + "edu.uci.ics.amber.operator.source.scan.FileScanSourceOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) .withPropagateSchema( - SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> inferSchema())) + SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> sourceSchema())) ) } - override def inferSchema(): Schema = { + override def sourceSchema(): Schema = { val builder = Schema.builder() if (outputFileName) builder.add(new Attribute("filename", AttributeType.STRING)) builder.add(new Attribute(attributeName, attributeType.getType)).build() diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpExec.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpExec.scala index 9bd68c67f16..1d786d31d3a 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpExec.scala @@ -4,6 +4,7 @@ import edu.uci.ics.amber.core.executor.SourceOperatorExecutor import edu.uci.ics.amber.core.storage.DocumentFactory import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.parseField import edu.uci.ics.amber.core.tuple.TupleLike +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.apache.commons.compress.archivers.{ArchiveInputStream, ArchiveStreamFactory} import org.apache.commons.io.IOUtils.toByteArray @@ -13,21 +14,17 @@ import scala.collection.mutable import scala.jdk.CollectionConverters.IteratorHasAsScala class FileScanSourceOpExec private[scan] ( - fileUri: String, - fileAttributeType: FileAttributeType, - fileEncoding: FileDecodingMethod, - extract: Boolean, - outputFileName: Boolean, - fileScanLimit: Option[Int] = None, - fileScanOffset: Option[Int] = None + descString: String ) extends SourceOperatorExecutor { + private val desc: FileScanSourceOpDesc = + objectMapper.readValue(descString, classOf[FileScanSourceOpDesc]) @throws[IOException] override def produceTuple(): Iterator[TupleLike] = { var filenameIt: Iterator[String] = Iterator.empty val fileEntries: Iterator[InputStream] = { - val is = DocumentFactory.newReadonlyDocument(new URI(fileUri)).asInputStream() - if (extract) { + val is = DocumentFactory.newReadonlyDocument(new URI(desc.fileName.get)).asInputStream() + if (desc.extract) { val inputStream: ArchiveInputStream = new ArchiveStreamFactory().createArchiveInputStream( new BufferedInputStream(is) ) @@ -43,34 +40,34 @@ class FileScanSourceOpExec private[scan] ( } } - if (fileAttributeType.isSingle) { + if (desc.attributeType.isSingle) { 
       fileEntries.zipAll(filenameIt, null, null).map {
         case (entry, fileName) =>
           val fields: mutable.ListBuffer[Any] = mutable.ListBuffer()
-          if (outputFileName) {
+          if (desc.outputFileName) {
             fields.addOne(fileName)
           }
-          fields.addOne(fileAttributeType match {
+          fields.addOne(desc.attributeType match {
             case FileAttributeType.SINGLE_STRING =>
-              new String(toByteArray(entry), fileEncoding.getCharset)
-            case _ => parseField(toByteArray(entry), fileAttributeType.getType)
+              new String(toByteArray(entry), desc.fileEncoding.getCharset)
+            case _ => parseField(toByteArray(entry), desc.attributeType.getType)
           })
           TupleLike(fields.toSeq: _*)
       }
     } else {
       fileEntries.flatMap(entry =>
-        new BufferedReader(new InputStreamReader(entry, fileEncoding.getCharset))
+        new BufferedReader(new InputStreamReader(entry, desc.fileEncoding.getCharset))
           .lines()
           .iterator()
           .asScala
           .slice(
-            fileScanOffset.getOrElse(0),
-            fileScanOffset.getOrElse(0) + fileScanLimit.getOrElse(Int.MaxValue)
+            desc.fileScanOffset.getOrElse(0),
+            desc.fileScanOffset.getOrElse(0) + desc.fileScanLimit.getOrElse(Int.MaxValue)
           )
           .map(line => {
-            TupleLike(fileAttributeType match {
+            TupleLike(desc.attributeType match {
               case FileAttributeType.SINGLE_STRING => line
-              case _ => parseField(line, fileAttributeType.getType)
+              case _ => parseField(line, desc.attributeType.getType)
             })
           })
       )
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala
index 919c35f6cb4..db2fefdac60 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/ScanSourceOpDesc.scala
@@ -3,6 +3,7 @@ package edu.uci.ics.amber.operator.source.scan
 import com.fasterxml.jackson.annotation.{JsonIgnore, JsonProperty, JsonPropertyDescription}
 import com.fasterxml.jackson.databind.annotation.JsonDeserialize
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
+import edu.uci.ics.amber.core.storage.FileResolver
 import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor
@@ -29,10 +30,6 @@ abstract class ScanSourceOpDesc extends SourceOperatorDescriptor {
   @JsonPropertyDescription("decoding charset to use on input")
   var fileEncoding: FileDecodingMethod = FileDecodingMethod.UTF_8

-  // uri of the file
-  @JsonIgnore
-  var fileUri: Option[String] = None
-
   @JsonIgnore
   var fileTypeName: Option[String] = None

@@ -48,10 +45,7 @@ abstract class ScanSourceOpDesc extends SourceOperatorDescriptor {
   @JsonDeserialize(contentAs = classOf[Int])
   var offset: Option[Int] = None

-  override def sourceSchema(): Schema = {
-    if (fileUri.isEmpty) return null
-    inferSchema()
-  }
+  override def sourceSchema(): Schema = null

   override def operatorInfo: OperatorInfo = {
     OperatorInfo(
@@ -63,12 +57,12 @@ abstract class ScanSourceOpDesc extends SourceOperatorDescriptor {
     )
   }

-  def inferSchema(): Schema
-
-  def setFileUri(uri: URI): Unit = {
-    fileUri = Some(uri.toASCIIString)
+  def setResolvedFileName(uri: URI): Unit = {
+    fileName = Some(uri.toASCIIString)
   }

   override def equals(that: Any): Boolean = EqualsBuilder.reflectionEquals(this, that, "context", "fileHandle")
+
+  def fileResolved(): Boolean = fileName.isDefined && FileResolver.isFileResolved(fileName.get)
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpDesc.scala
index 96f48fdb75d..135ae2b6657 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpDesc.scala
@@ -1,13 +1,14 @@
 package edu.uci.ics.amber.operator.source.scan.arrow

 import com.fasterxml.jackson.annotation.JsonIgnoreProperties
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.storage.DocumentFactory
 import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
 import edu.uci.ics.amber.util.ArrowUtils
+import edu.uci.ics.amber.util.JSONUtils.objectMapper

 import java.io.IOException
 import java.net.URI
@@ -16,6 +17,7 @@ import java.nio.file.StandardOpenOption
 import org.apache.arrow.memory.RootAllocator
 import org.apache.arrow.vector.ipc.ArrowFileReader
 import org.apache.arrow.vector.types.pojo.{Schema => ArrowSchema}
+
 import scala.util.Using

 @JsonIgnoreProperties(value = Array("fileEncoding"))
@@ -33,7 +35,10 @@ class ArrowSourceOpDesc extends ScanSourceOpDesc {
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo((_, _) => createArrowSourceOpExec())
+        OpExecWithClassName(
+          "edu.uci.ics.amber.operator.source.scan.arrow.ArrowSourceOpExec",
+          objectMapper.writeValueAsString(this)
+        )
       )
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
@@ -42,15 +47,6 @@ class ArrowSourceOpDesc extends ScanSourceOpDesc {
     )
   }

-  private def createArrowSourceOpExec() = {
-    new ArrowSourceOpExec(
-      fileUri.get,
-      limit,
-      offset,
-      schemaFunc = () => sourceSchema()
-    )
-  }
-
   /**
    * Infer Texera.Schema based on the top few lines of data.
   *
@@ -58,7 +54,7 @@
    */
   @Override
   def inferSchema(): Schema = {
-    val file = DocumentFactory.newReadonlyDocument(new URI(fileUri.get)).asFile()
+    val file = DocumentFactory.newReadonlyDocument(new URI(fileName.get)).asFile()
     val allocator = new RootAllocator()

     Using
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpExec.scala
index d359b847f8a..548d0734bb8 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/arrow/ArrowSourceOpExec.scala
@@ -2,37 +2,33 @@ package edu.uci.ics.amber.operator.source.scan.arrow
 import edu.uci.ics.amber.core.executor.SourceOperatorExecutor
 import edu.uci.ics.amber.core.storage.DocumentFactory
+import edu.uci.ics.amber.core.tuple.TupleLike
+import edu.uci.ics.amber.util.ArrowUtils
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import org.apache.arrow.memory.RootAllocator
 import org.apache.arrow.vector.VectorSchemaRoot
 import org.apache.arrow.vector.ipc.ArrowFileReader
-import edu.uci.ics.amber.core.tuple.{Schema, TupleLike}
-import edu.uci.ics.amber.util.ArrowUtils

 import java.net.URI
-import java.nio.file.{Files}
-import java.nio.file.StandardOpenOption
+import java.nio.file.{Files, StandardOpenOption}

 class ArrowSourceOpExec(
-    fileUri: String,
-    limit: Option[Int],
-    offset: Option[Int],
-    schemaFunc: () => Schema
+    descString: String
 ) extends SourceOperatorExecutor {
-
+  private val desc: ArrowSourceOpDesc =
+    objectMapper.readValue(descString, classOf[ArrowSourceOpDesc])
   private var reader: Option[ArrowFileReader] = None
   private var root: Option[VectorSchemaRoot] = None
-  private var schema: Option[Schema] = None
   private var allocator: Option[RootAllocator] = None
   override def open(): Unit = {
     try {
-      val file = DocumentFactory.newReadonlyDocument(new URI(fileUri)).asFile()
+      val file = DocumentFactory.newReadonlyDocument(new URI(desc.fileName.get)).asFile()
       val alloc = new RootAllocator()
       allocator = Some(alloc)
       val channel = Files.newByteChannel(file.toPath, StandardOpenOption.READ)
       val arrowReader = new ArrowFileReader(channel, alloc)
       val vectorRoot = arrowReader.getVectorSchemaRoot
-      schema = Some(schemaFunc())
       reader = Some(arrowReader)
       root = Some(vectorRoot)
     } catch {
@@ -73,8 +69,8 @@ class ArrowSourceOpExec(
       }
     }

-    var tupleIterator = rowIterator.drop(offset.getOrElse(0))
-    if (limit.isDefined) tupleIterator = tupleIterator.take(limit.get)
+    var tupleIterator = rowIterator.drop(desc.offset.getOrElse(0))
+    if (desc.limit.isDefined) tupleIterator = tupleIterator.take(desc.limit.get)

     tupleIterator
   }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala
index 6d9fc7a5d22..cd2fdda4bdf 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala
@@ -1,15 +1,15 @@
 package edu.uci.ics.amber.operator.source.scan.csv

-import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
-import com.fasterxml.jackson.databind.annotation.JsonDeserialize
+import com.fasterxml.jackson.annotation.{JsonInclude, JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import com.univocity.parsers.csv.{CsvFormat, CsvParser, CsvParserSettings}
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.storage.DocumentFactory
 import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.inferSchemaFromRows
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}

 import java.io.{IOException, InputStreamReader}
@@ -20,7 +20,7 @@ class CSVScanSourceOpDesc extends ScanSourceOpDesc {
   @JsonProperty(defaultValue = ",")
   @JsonSchemaTitle("Delimiter")
   @JsonPropertyDescription("delimiter to separate each line into fields")
-  @JsonDeserialize(contentAs = classOf[java.lang.String])
+  @JsonInclude(JsonInclude.Include.NON_ABSENT)
   var customDelimiter: Option[String] = None

   @JsonProperty(defaultValue = "true")
@@ -36,45 +36,32 @@ class CSVScanSourceOpDesc extends ScanSourceOpDesc {
       executionId: ExecutionIdentity
   ): PhysicalOp = {
     // fill in default values
-    if (customDelimiter.isEmpty || customDelimiter.get.isEmpty)
+    if (customDelimiter.isEmpty || customDelimiter.get.isEmpty) {
       customDelimiter = Option(",")
+    }

     PhysicalOp
       .sourcePhysicalOp(
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo((_, _) =>
-          new CSVScanSourceOpExec(
-            fileUri.get,
-            fileEncoding,
-            limit,
-            offset,
-            customDelimiter,
-            hasHeader,
-            schemaFunc = () => sourceSchema()
-          )
+        OpExecWithClassName(
+          "edu.uci.ics.amber.operator.source.scan.csv.CSVScanSourceOpExec",
+          objectMapper.writeValueAsString(this)
         )
       )
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
       .withPropagateSchema(
-        SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> inferSchema()))
+        SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> sourceSchema()))
       )
   }

-  /**
-   * Infer Texera.Schema based on the top few lines of data.
-   *
-   * @return Texera.Schema build for this operator
-   */
-  @Override
-  def inferSchema(): Schema = {
-    if (customDelimiter.isEmpty || fileUri.isEmpty) {
+  override def sourceSchema(): Schema = {
+    if (customDelimiter.isEmpty || !fileResolved()) {
       return null
     }
-
-    val stream = DocumentFactory.newReadonlyDocument(new URI(fileUri.get)).asInputStream()
+    val stream = DocumentFactory.newReadonlyDocument(new URI(fileName.get)).asInputStream()

     val inputReader = new InputStreamReader(stream, fileEncoding.getCharset)
@@ -111,6 +98,7 @@ class CSVScanSourceOpDesc extends ScanSourceOpDesc {
       .builder()
       .add(header.indices.map(i => new Attribute(header(i), attributeTypeList(i))))
       .build()
+  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpExec.scala
index b22182f47f9..a111219440c 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpExec.scala
@@ -3,25 +3,17 @@ package edu.uci.ics.amber.operator.source.scan.csv
 import com.univocity.parsers.csv.{CsvFormat, CsvParser, CsvParserSettings}
 import edu.uci.ics.amber.core.executor.SourceOperatorExecutor
 import edu.uci.ics.amber.core.storage.DocumentFactory
-import edu.uci.ics.amber.core.tuple.{AttributeTypeUtils, Schema, TupleLike}
-import edu.uci.ics.amber.operator.source.scan.FileDecodingMethod
+import edu.uci.ics.amber.core.tuple.{AttributeTypeUtils, TupleLike}
+import edu.uci.ics.amber.util.JSONUtils.objectMapper

 import java.io.InputStreamReader
 import java.net.URI
 import scala.collection.immutable.ArraySeq

-class CSVScanSourceOpExec private[csv] (
-    fileUri: String,
-    fileEncoding: FileDecodingMethod,
-    limit: Option[Int],
-    offset: Option[Int],
-    customDelimiter: Option[String],
-    hasHeader: Boolean,
-    schemaFunc: () => Schema
-) extends SourceOperatorExecutor {
+class CSVScanSourceOpExec private[csv] (descString: String) extends SourceOperatorExecutor {
+  val desc: CSVScanSourceOpDesc = objectMapper.readValue(descString, classOf[CSVScanSourceOpDesc])
   var inputReader: InputStreamReader = _
   var parser: CsvParser = _
-  var schema: Schema = _

   var nextRow: Array[String] = _
   var numRowGenerated = 0
@@ -45,12 +37,12 @@ class CSVScanSourceOpExec private[csv] (
     }

     var tupleIterator = rowIterator
-      .drop(offset.getOrElse(0))
+      .drop(desc.offset.getOrElse(0))
       .map(row => {
         try {
           TupleLike(
             ArraySeq.unsafeWrapArray(
-              AttributeTypeUtils.parseFields(row.asInstanceOf[Array[Any]], schema)
+              AttributeTypeUtils.parseFields(row.asInstanceOf[Array[Any]], desc.sourceSchema())
             ): _*
           )
         } catch {
@@ -59,19 +51,19 @@ class CSVScanSourceOpExec private[csv] (
       })
       .filter(t => t != null)

-    if (limit.isDefined) tupleIterator = tupleIterator.take(limit.get)
+    if (desc.limit.isDefined) tupleIterator = tupleIterator.take(desc.limit.get)

     tupleIterator
   }

   override def open(): Unit = {
     inputReader = new InputStreamReader(
-      DocumentFactory.newReadonlyDocument(new URI(fileUri)).asInputStream(),
-      fileEncoding.getCharset
+      DocumentFactory.newReadonlyDocument(new URI(desc.fileName.get)).asInputStream(),
+      desc.fileEncoding.getCharset
     )

     val csvFormat = new CsvFormat()
-    csvFormat.setDelimiter(customDelimiter.get.charAt(0))
+    csvFormat.setDelimiter(desc.customDelimiter.get.charAt(0))
     csvFormat.setLineSeparator("\n")
     csvFormat.setComment(
       '\u0000'
@@ -79,12 +71,10 @@ class CSVScanSourceOpExec private[csv] (
     val csvSetting = new CsvParserSettings()
     csvSetting.setMaxCharsPerColumn(-1)
     csvSetting.setFormat(csvFormat)
-    csvSetting.setHeaderExtractionEnabled(hasHeader)
+    csvSetting.setHeaderExtractionEnabled(desc.hasHeader)
     parser = new CsvParser(csvSetting)
     parser.beginParsing(inputReader)
-
-    schema = schemaFunc()
   }

   override def close(): Unit = {
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala
index eb50cbe0910..4d4202da703 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala
@@ -4,13 +4,14 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.fasterxml.jackson.databind.annotation.JsonDeserialize
 import com.github.tototoshi.csv.{CSVReader, DefaultCSVFormat}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.storage.DocumentFactory
 import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.inferSchemaFromRows
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
 import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc
+import edu.uci.ics.amber.util.JSONUtils.objectMapper

 import java.io.IOException
 import java.net.URI
@@ -36,54 +37,33 @@ class ParallelCSVScanSourceOpDesc extends ScanSourceOpDesc {
       executionId: ExecutionIdentity
   ): PhysicalOp = {
     // fill in default values
-    if (customDelimiter.get.isEmpty)
+    if (customDelimiter.get.isEmpty) {
       customDelimiter = Option(",")
-
-    // here, the stream requires to be seekable, so datasetFileDesc creates a temp file here
-    // TODO: consider a better way
-    val file = DocumentFactory.newReadonlyDocument(new URI(fileUri.get)).asFile()
-    val totalBytes: Long = file.length()
+    }

     PhysicalOp
       .sourcePhysicalOp(
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo((idx, workerCount) => {
-          // TODO: add support for limit
-          // TODO: add support for offset
-          val startOffset: Long = totalBytes / workerCount * idx
-          val endOffset: Long =
-            if (idx != workerCount - 1) totalBytes / workerCount * (idx + 1) else totalBytes
-          new ParallelCSVScanSourceOpExec(
-            file,
-            customDelimiter,
-            hasHeader,
-            startOffset,
-            endOffset,
-            schemaFunc = () => sourceSchema()
-          )
-        })
+        OpExecWithClassName(
+          "edu.uci.ics.amber.operator.source.scan.csv.ParallelCSVScanSourceOpExec",
+          objectMapper.writeValueAsString(this)
+        )
       )
       .withInputPorts(operatorInfo.inputPorts)
      .withOutputPorts(operatorInfo.outputPorts)
       .withParallelizable(true)
       .withPropagateSchema(
-        SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> inferSchema()))
+        SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> sourceSchema()))
       )
   }

-  /**
-   * Infer Texera.Schema based on the top few lines of data.
-   *
-   * @return Texera.Schema build for this operator
-   */
-  @Override
-  def inferSchema(): Schema = {
-    if (customDelimiter.isEmpty || fileUri.isEmpty) {
+  override def sourceSchema(): Schema = {
+    if (customDelimiter.isEmpty || !fileResolved()) {
       return null
     }
-    val file = DocumentFactory.newReadonlyDocument(new URI(fileUri.get)).asFile()
+    val file = DocumentFactory.newReadonlyDocument(new URI(fileName.get)).asFile()

     implicit object CustomFormat extends DefaultCSVFormat {
       override val delimiter: Char = customDelimiter.get.charAt(0)
@@ -118,6 +98,7 @@ class ParallelCSVScanSourceOpDesc extends ScanSourceOpDesc {
         )
       )
       .build()
+  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpExec.scala
index 59ba08b0169..9bdde254c79 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpExec.scala
@@ -1,24 +1,24 @@
 package edu.uci.ics.amber.operator.source.scan.csv

 import edu.uci.ics.amber.core.executor.SourceOperatorExecutor
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeTypeUtils, Schema, TupleLike}
+import edu.uci.ics.amber.core.storage.DocumentFactory
+import edu.uci.ics.amber.core.tuple.{Attribute, AttributeTypeUtils, TupleLike}
 import edu.uci.ics.amber.operator.source.BufferedBlockReader
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import org.tukaani.xz.SeekableFileInputStream

-import java.io.File
+import java.net.URI
 import java.util
 import java.util.stream.{IntStream, Stream}
 import scala.collection.compat.immutable.ArraySeq

 class ParallelCSVScanSourceOpExec private[csv] (
-    file: File,
-    customDelimiter: Option[String],
-    hasHeader: Boolean,
-    startOffset: Long,
-    endOffset: Long,
-    schemaFunc: () => Schema
+    descString: String,
+    idx: Int = 0,
+    workerCount: Int = 1
 ) extends SourceOperatorExecutor {
-  private var schema: Schema = _
+  val desc: ParallelCSVScanSourceOpDesc =
+    objectMapper.readValue(descString, classOf[ParallelCSVScanSourceOpDesc])
   private var reader: BufferedBlockReader = _

   override def produceTuple(): Iterator[TupleLike] =
@@ -42,6 +42,7 @@ class ParallelCSVScanSourceOpExec private[csv] (
       return null
     }

+    val schema = desc.sourceSchema()
     // however the null values won't present if omitted in the end, we need to match nulls.
     if (fields.length != schema.getAttributes.size)
       fields = Stream
@@ -68,19 +69,29 @@ class ParallelCSVScanSourceOpExec private[csv] (
   }.filter(tuple => tuple != null)

   override def open(): Unit = {
+    // here, the stream requires to be seekable, so datasetFileDesc creates a temp file here
+    // TODO: consider a better way
+    val file = DocumentFactory.newReadonlyDocument(new URI(desc.fileName.get)).asFile()
+    val totalBytes: Long = file.length()
+    // TODO: add support for limit
+    // TODO: add support for offset
+    val startOffset: Long = totalBytes / workerCount * idx
+    val endOffset: Long =
+      if (idx != workerCount - 1) totalBytes / workerCount * (idx + 1) else totalBytes
+
     val stream = new SeekableFileInputStream(file)
-    schema = schemaFunc()
+
     stream.seek(startOffset)
     reader = new BufferedBlockReader(
       stream,
       endOffset - startOffset,
-      customDelimiter.get.charAt(0),
+      desc.customDelimiter.get.charAt(0),
       null
     )
     // skip line if this worker reads from middle of a file
     if (startOffset > 0) reader.readLine
     // skip line if this worker reads the start of a file, and the file has a header line
-    if (startOffset == 0 && hasHeader) reader.readLine
+    if (startOffset == 0 && desc.hasHeader) reader.readLine
   }

   override def close(): Unit = reader.close()
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala
index f3b2ea1f2e6..9ea25e13147 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala
@@ -1,16 +1,16 @@
 package edu.uci.ics.amber.operator.source.scan.csvOld

 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
-import com.fasterxml.jackson.databind.annotation.JsonDeserialize
 import com.github.tototoshi.csv.{CSVReader, DefaultCSVFormat}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.storage.DocumentFactory
 import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.inferSchemaFromRows
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
 import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc
+import edu.uci.ics.amber.util.JSONUtils.objectMapper

 import java.io.IOException
 import java.net.URI
@@ -20,8 +20,7 @@ class CSVOldScanSourceOpDesc extends ScanSourceOpDesc {
   @JsonProperty(defaultValue = ",")
   @JsonSchemaTitle("Delimiter")
   @JsonPropertyDescription("delimiter to separate each line into fields")
-  @JsonDeserialize(contentAs = classOf[java.lang.String])
-  var customDelimiter: Option[String] = None
+  var customDelimiter: Option[String] = Some(",")

   @JsonProperty(defaultValue = "true")
   @JsonSchemaTitle("Header")
@@ -36,43 +35,32 @@ class CSVOldScanSourceOpDesc extends ScanSourceOpDesc {
       executionId: ExecutionIdentity
   ): PhysicalOp = {
     // fill in default values
-    if (customDelimiter.get.isEmpty)
+    if (customDelimiter.get.isEmpty) {
       customDelimiter = Option(",")
+    }

     PhysicalOp
       .sourcePhysicalOp(
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo((_, _) =>
-          new CSVOldScanSourceOpExec(
-            fileUri.get,
-            fileEncoding,
-            limit,
-            offset,
-            customDelimiter,
-            hasHeader,
-            schemaFunc = () => sourceSchema()
-          )
+        OpExecWithClassName(
+          "edu.uci.ics.amber.operator.source.scan.csvOld.CSVOldScanSourceOpExec",
+          objectMapper.writeValueAsString(this)
         )
       )
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
       .withPropagateSchema(
-        SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> inferSchema()))
+        SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> sourceSchema()))
       )
   }

-  /**
-   * Infer Texera.Schema based on the top few lines of data.
-   *
-   * @return Texera.Schema build for this operator
-   */
-  @Override
-  def inferSchema(): Schema = {
-    if (customDelimiter.isEmpty || fileUri.isEmpty) {
+  override def sourceSchema(): Schema = {
+    if (customDelimiter.isEmpty || !fileResolved()) {
       return null
     }
-    val file = DocumentFactory.newReadonlyDocument(new URI(fileUri.get)).asFile()
+    // infer schema from the first few lines of the file
+    val file = DocumentFactory.newReadonlyDocument(new URI(fileName.get)).asFile()
     implicit object CustomFormat extends DefaultCSVFormat {
       override val delimiter: Char = customDelimiter.get.charAt(0)
     }
@@ -108,6 +96,7 @@ class CSVOldScanSourceOpDesc extends ScanSourceOpDesc {
         )
       )
       .build()
+  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpExec.scala
index 7a92698d426..28241ea3cf5 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpExec.scala
@@ -4,27 +4,22 @@ import com.github.tototoshi.csv.{CSVReader, DefaultCSVFormat}
 import edu.uci.ics.amber.core.executor.SourceOperatorExecutor
 import edu.uci.ics.amber.core.storage.DocumentFactory
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeTypeUtils, Schema, TupleLike}
-import edu.uci.ics.amber.operator.source.scan.FileDecodingMethod
+import edu.uci.ics.amber.util.JSONUtils.objectMapper

 import java.net.URI
 import scala.collection.compat.immutable.ArraySeq

 class CSVOldScanSourceOpExec private[csvOld] (
-    fileUri: String,
-    fileEncoding: FileDecodingMethod,
-    limit: Option[Int],
-    offset: Option[Int],
-    customDelimiter: Option[String],
-    hasHeader: Boolean,
-    schemaFunc: () => Schema
+    descString: String
 ) extends SourceOperatorExecutor {
-  var schema: Schema = _
+  val desc: CSVOldScanSourceOpDesc =
+    objectMapper.readValue(descString, classOf[CSVOldScanSourceOpDesc])
   var reader: CSVReader = _
   var rows: Iterator[Seq[String]] = _
-
+  val schema: Schema = desc.sourceSchema()
   override def produceTuple(): Iterator[TupleLike] = {
-    var tuples = rows
+    val tuples = rows
       .map(fields =>
         try {
           val parsedFields: Array[Any] = AttributeTypeUtils.parseFields(
@@ -40,24 +35,27 @@ class CSVOldScanSourceOpExec private[csvOld] (
       )
       .filter(tuple => tuple != null)

-    if (limit.isDefined) tuples = tuples.take(limit.get)
-    tuples
+    if (desc.limit.isDefined)
+      tuples.take(desc.limit.get)
+    else {
+      tuples
+    }
   }

   override def open(): Unit = {
-    schema = schemaFunc()
     implicit object CustomFormat extends DefaultCSVFormat {
-      override val delimiter: Char = customDelimiter.get.charAt(0)
+      override val delimiter: Char = desc.customDelimiter.get.charAt(0)
     }
-    val filePath = DocumentFactory.newReadonlyDocument(new URI(fileUri)).asFile().toPath
-    reader = CSVReader.open(filePath.toString, fileEncoding.getCharset.name())(CustomFormat)
+    val filePath = DocumentFactory.newReadonlyDocument(new URI(desc.fileName.get)).asFile().toPath
+    reader = CSVReader.open(filePath.toString, desc.fileEncoding.getCharset.name())(CustomFormat)
     // skip line if this worker reads the start of a file, and the file has a header line
-    val startOffset = offset.getOrElse(0) + (if (hasHeader) 1 else 0)
-
+    val startOffset = desc.offset.getOrElse(0) + (if (desc.hasHeader) 1 else 0)
     rows = reader.iterator.drop(startOffset)
   }

   override def close(): Unit = {
-    reader.close()
+    if (reader != null) {
+      reader.close()
+    }
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala
index 0be43a62c39..9a9deee9bbc 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala
@@ -2,15 +2,15 @@ package edu.uci.ics.amber.operator.source.scan.json
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.fasterxml.jackson.databind.JsonNode
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.storage.DocumentFactory
 import edu.uci.ics.amber.core.storage.model.DatasetFileDocument
 import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.inferSchemaFromRows
 import edu.uci.ics.amber.core.tuple.{Attribute, Schema}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc
 import edu.uci.ics.amber.util.JSONUtils.{JSONToMap, objectMapper}
-import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}

 import java.io._
 import java.net.URI
@@ -38,56 +38,30 @@ class JSONLScanSourceOpDesc extends ScanSourceOpDesc {
       workflowId: WorkflowIdentity,
       executionId: ExecutionIdentity
   ): PhysicalOp = {
-    val stream = DocumentFactory.newReadonlyDocument(new URI(fileUri.get)).asInputStream()
-    // count lines and partition the task to each worker
-    val reader = new BufferedReader(
-      new InputStreamReader(stream, fileEncoding.getCharset)
-    )
-    val offsetValue = offset.getOrElse(0)
-    var lines = reader.lines().iterator().asScala.drop(offsetValue)
-    if (limit.isDefined) lines = lines.take(limit.get)
-    val count: Int = lines.map(_ => 1).sum
-    reader.close()

     PhysicalOp
       .sourcePhysicalOp(
workflowId, executionId, operatorIdentifier, - OpExecInitInfo((idx, workerCount) => { - val startOffset: Int = offsetValue + count / workerCount * idx - val endOffset: Int = - offsetValue + (if (idx != workerCount - 1) count / workerCount * (idx + 1) - else count) - new JSONLScanSourceOpExec( - fileUri.get, - fileEncoding, - startOffset, - endOffset, - flatten, - schemaFunc = () => inferSchema() - ) - }) + OpExecWithClassName( + "edu.uci.ics.amber.operator.source.scan.json.JSONLScanSourceOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) .withParallelizable(true) .withPropagateSchema( - SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> inferSchema())) + SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> sourceSchema())) ) } - /** - * Infer Texera.Schema based on the top few lines of data. - * - * @return Texera.Schema build for this operator - */ - @Override - def inferSchema(): Schema = { - if (fileUri.isEmpty) { + override def sourceSchema(): Schema = { + if (!fileResolved()) { return null } - val stream = DocumentFactory.newReadonlyDocument(new URI(fileUri.get)).asInputStream() + val stream = DocumentFactory.newReadonlyDocument(new URI(fileName.get)).asInputStream() val reader = new BufferedReader(new InputStreamReader(stream, fileEncoding.getCharset)) var fieldNames = Set[String]() @@ -132,6 +106,6 @@ class JSONLScanSourceOpDesc extends ScanSourceOpDesc { .map(i => new Attribute(sortedFieldNames(i), attributeTypes(i))) ) .build() - } + } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpExec.scala index 221fdd57050..ec4490d6964 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpExec.scala +++ 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpExec.scala
@@ -3,8 +3,7 @@ package edu.uci.ics.amber.operator.source.scan.json
 import edu.uci.ics.amber.core.executor.SourceOperatorExecutor
 import edu.uci.ics.amber.core.storage.DocumentFactory
 import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.parseField
-import edu.uci.ics.amber.core.tuple.{Schema, TupleLike}
-import edu.uci.ics.amber.operator.source.scan.FileDecodingMethod
+import edu.uci.ics.amber.core.tuple.TupleLike
 import edu.uci.ics.amber.operator.source.scan.json.JSONUtil.JSONToMap
 import edu.uci.ics.amber.util.JSONUtils.objectMapper
@@ -14,21 +13,20 @@ import scala.jdk.CollectionConverters.IteratorHasAsScala
 import scala.util.{Failure, Success, Try}

 class JSONLScanSourceOpExec private[json] (
-    fileUri: String,
-    fileEncoding: FileDecodingMethod,
-    startOffset: Int,
-    endOffset: Int,
-    flatten: Boolean,
-    schemaFunc: () => Schema
+    descString: String,
+    idx: Int = 0,
+    workerCount: Int = 1
 ) extends SourceOperatorExecutor {
-  private var schema: Schema = _
+  private val desc: JSONLScanSourceOpDesc =
+    objectMapper.readValue(descString, classOf[JSONLScanSourceOpDesc])
+  private val schema = desc.sourceSchema()
   private var rows: Iterator[String] = _
   private var reader: BufferedReader = _

   override def produceTuple(): Iterator[TupleLike] = {
     rows.flatMap { line =>
       Try {
-        val data = JSONToMap(objectMapper.readTree(line), flatten).withDefaultValue(null)
+        val data = JSONToMap(objectMapper.readTree(line), desc.flatten).withDefaultValue(null)
         val fields = schema.getAttributeNames.map { fieldName =>
           parseField(data(fieldName), schema.getAttribute(fieldName).getType)
         }
@@ -41,14 +39,23 @@ class JSONLScanSourceOpExec private[json] (
   }

   override def open(): Unit = {
-    schema = schemaFunc()
+    val stream = DocumentFactory.newReadonlyDocument(new URI(desc.fileName.get)).asInputStream()
+    // count lines and partition the task to each worker
     reader = new BufferedReader(
-      new 
InputStreamReader(
-        DocumentFactory.newReadonlyDocument(new URI(fileUri)).asInputStream(),
-        fileEncoding.getCharset
-      )
+      new InputStreamReader(stream, desc.fileEncoding.getCharset)
     )
-    rows = reader.lines().iterator().asScala.slice(startOffset, endOffset)
+    val offsetValue = desc.offset.getOrElse(0)
+    var lines = reader.lines().iterator().asScala.drop(offsetValue)
+    if (desc.limit.isDefined) lines = lines.take(desc.limit.get)
+    val (it1, it2) = lines.duplicate
+    val count: Int = it1.map(_ => 1).sum
+
+    val startOffset: Int = count / workerCount * idx
+    val endOffset: Int =
+      if (idx != workerCount - 1) count / workerCount * (idx + 1)
+      else count
+
+    rows = it2.iterator.slice(startOffset, endOffset)
   }

   override def close(): Unit = reader.close()
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala
index e3aaec7da42..bdb59fff827 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala
@@ -2,12 +2,13 @@ package edu.uci.ics.amber.operator.source.scan.text

 import com.fasterxml.jackson.annotation.JsonProperty
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.tuple.{Attribute, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.UIWidget
 import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor
+import 
edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.OutputPort @@ -26,8 +27,9 @@ class TextInputSourceOpDesc extends SourceOperatorDescriptor with TextSourceOpDe workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => - new TextInputSourceOpExec(attributeType, textInput, fileScanLimit, fileScanOffset) + OpExecWithClassName( + "edu.uci.ics.amber.operator.source.scan.text.TextInputSourceOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpExec.scala index 104c9cae558..76260167adc 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpExec.scala @@ -4,27 +4,26 @@ import edu.uci.ics.amber.core.executor.SourceOperatorExecutor import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.parseField import edu.uci.ics.amber.core.tuple.TupleLike import edu.uci.ics.amber.operator.source.scan.FileAttributeType +import edu.uci.ics.amber.util.JSONUtils.objectMapper class TextInputSourceOpExec private[text] ( - fileAttributeType: FileAttributeType, - textInput: String, - fileScanLimit: Option[Int] = None, - fileScanOffset: Option[Int] = None + descString: String ) extends SourceOperatorExecutor { - + private val desc: TextInputSourceOpDesc = + objectMapper.readValue(descString, classOf[TextInputSourceOpDesc]) override def produceTuple(): Iterator[TupleLike] = { - (if (fileAttributeType.isSingle) { - Iterator(textInput) + (if (desc.attributeType.isSingle) { + Iterator(desc.textInput) } else { - textInput.linesIterator.slice( 
- fileScanOffset.getOrElse(0), - fileScanOffset.getOrElse(0) + fileScanLimit.getOrElse(Int.MaxValue) + desc.textInput.linesIterator.slice( + desc.fileScanOffset.getOrElse(0), + desc.fileScanOffset.getOrElse(0) + desc.fileScanLimit.getOrElse(Int.MaxValue) ) }).map(line => - TupleLike(fileAttributeType match { + TupleLike(desc.attributeType match { case FileAttributeType.SINGLE_STRING => line case FileAttributeType.BINARY => line.getBytes - case _ => parseField(line, fileAttributeType.getType) + case _ => parseField(line, desc.attributeType.getType) }) ) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpDesc.scala index 1eff093a236..77113ff4660 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpDesc.scala @@ -106,20 +106,7 @@ abstract class SQLSourceOpDesc extends SourceOperatorDescriptor { @BatchByColumn var interval = 0L - /** - * Make sure all the required parameters are not empty, - * then query the remote PostgreSQL server for the table schema - * - * @return Tuple.Schema - */ - override def sourceSchema(): Schema = { - if ( - this.host == null || this.port == null || this.database == null - || this.table == null || this.username == null || this.password == null - ) - return null - querySchema - } + override def sourceSchema(): Schema = querySchema // needs to define getters for sub classes to override Jackson Annotations def getKeywords: Option[String] = keywords @@ -131,7 +118,14 @@ abstract class SQLSourceOpDesc extends SourceOperatorDescriptor { * * @return Schema */ - protected def querySchema: Schema = { + private def querySchema: Schema = { + if ( + this.host == null || this.port == null || this.database == null + || this.table == null || this.username == 
null || this.password == null
+    ) {
+      return null
+    }
+
     updatePort()

     val schemaBuilder = Schema.builder()
     try {
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpExec.scala
index 77bb30d7731..e232d13d254 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpExec.scala
@@ -3,31 +3,18 @@ package edu.uci.ics.amber.operator.source.sql

 import edu.uci.ics.amber.core.executor.SourceOperatorExecutor
 import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.{parseField, parseTimestamp}
 import edu.uci.ics.amber.core.tuple._
+import edu.uci.ics.amber.util.JSONUtils.objectMapper

 import java.sql._
 import scala.collection.mutable.ArrayBuffer
 import scala.util.control.Breaks.{break, breakable}

-abstract class SQLSourceOpExec(
-    // source configs
-    table: String,
-    var curLimit: Option[Long],
-    var curOffset: Option[Long],
-    // progressiveness related
-    progressive: Option[Boolean],
-    batchByColumn: Option[String],
-    min: Option[String],
-    max: Option[String],
-    interval: Long,
-    // filter conditions:
-    keywordSearch: Boolean,
-    keywordSearchByColumn: String,
-    keywords: String,
-    schemaFunc: () => Schema
-) extends SourceOperatorExecutor {
-
-  // connection and query related
+abstract class SQLSourceOpExec(descString: String) extends SourceOperatorExecutor {
+  val desc: SQLSourceOpDesc = objectMapper.readValue(descString, classOf[SQLSourceOpDesc])
   var schema: Schema = _
+  var curLimit: Option[Long] = desc.limit
+  var curOffset: Option[Long] = desc.offset
+  // connection and query related
   val tableNames: ArrayBuffer[String] = ArrayBuffer()
   var batchByAttribute: Option[Attribute] = None
   var connection: Connection = _
@@ -100,7 +87,7 @@ abstract class SQLSourceOpExec(
     // update the limit in order to 
adapt to progressive batches curLimit.foreach(limit => { if (limit > 0) { - curLimit = Option(limit - 1) + curLimit = Some(limit - 1) } }) return tuple @@ -141,18 +128,18 @@ abstract class SQLSourceOpExec( */ @throws[SQLException] override def open(): Unit = { - schema = schemaFunc() batchByAttribute = - if (progressive.getOrElse(false)) Option(schema.getAttribute(batchByColumn.get)) else None + if (desc.progressive.getOrElse(false)) Option(schema.getAttribute(desc.batchByColumn.get)) + else None connection = establishConn() // load user table names from the given database loadTableNames() // validates the input table name - if (!tableNames.contains(table)) - throw new RuntimeException("Can't find the given table `" + table + "`.") + if (!tableNames.contains(desc.table)) + throw new RuntimeException("Can't find the given table `" + desc.table + "`.") // load for batch column value boundaries used to split mini queries - if (progressive.getOrElse(false)) initBatchColumnBoundaries() + if (desc.progressive.getOrElse(false)) initBatchColumnBoundaries() } /** @@ -244,7 +231,7 @@ abstract class SQLSourceOpExec( } protected def addBaseSelect(queryBuilder: StringBuilder): Unit = { - queryBuilder ++= "\n" + "SELECT * FROM " + table + " where 1 = 1" + queryBuilder ++= "\n" + "SELECT * FROM " + desc.table + " where 1 = 1" } /** @@ -272,10 +259,10 @@ abstract class SQLSourceOpExec( case Some(attribute) => attribute.getType match { case AttributeType.INTEGER | AttributeType.LONG | AttributeType.TIMESTAMP => - nextLowerBound = curLowerBound.longValue + interval + nextLowerBound = curLowerBound.longValue + desc.interval isLastBatch = nextLowerBound.longValue >= upperBound.longValue case AttributeType.DOUBLE => - nextLowerBound = curLowerBound.doubleValue + interval + nextLowerBound = curLowerBound.doubleValue + desc.interval isLastBatch = nextLowerBound.doubleValue >= upperBound.doubleValue case AttributeType.BOOLEAN | AttributeType.STRING | AttributeType.ANY | _ => throw new 
IllegalArgumentException("Unexpected type: " + attribute.getType) @@ -289,7 +276,7 @@ abstract class SQLSourceOpExec( " < " + batchAttributeToString(nextLowerBound)) case None => throw new IllegalArgumentException( - "no valid batchByColumn to iterate: " + batchByColumn.getOrElse("") + "no valid batchByColumn to iterate: " + desc.batchByColumn.getOrElse("") ) } curLowerBound = nextLowerBound @@ -316,7 +303,7 @@ abstract class SQLSourceOpExec( } case None => throw new IllegalArgumentException( - "No valid batchByColumn to iterate: " + batchByColumn.getOrElse("") + "No valid batchByColumn to iterate: " + desc.batchByColumn.getOrElse("") ) } @@ -335,7 +322,7 @@ abstract class SQLSourceOpExec( case Some(attribute) => var result: Number = null val preparedStatement = connection.prepareStatement( - "SELECT " + side + "(" + attribute.getName + ") FROM " + table + ";" + "SELECT " + side + "(" + attribute.getName + ") FROM " + desc.table + ";" ) val resultSet = preparedStatement.executeQuery resultSet.next @@ -410,7 +397,7 @@ abstract class SQLSourceOpExec( addFilterConditions(queryBuilder) // add sliding window if progressive mode is enabled - if (progressive.getOrElse(false) && batchByColumn.isDefined && interval > 0L) + if (desc.progressive.getOrElse(false) && desc.batchByColumn.isDefined && desc.interval > 0L) addBatchSlidingWindow(queryBuilder) // add limit if provided @@ -422,7 +409,7 @@ abstract class SQLSourceOpExec( } // add fixed offset if not progressive - if (!progressive.getOrElse(false) && curOffset.isDefined) addOffset(queryBuilder) + if (!desc.progressive.getOrElse(false) && curOffset.isDefined) addOffset(queryBuilder) // end terminateSQL(queryBuilder) @@ -450,7 +437,12 @@ abstract class SQLSourceOpExec( var curIndex = 1 // fill up the keywords - if (keywordSearch && keywordSearchByColumn != null && keywords != null) { + val keywords = desc.keywords.orNull + if ( + desc.keywordSearch.getOrElse( + false + ) && desc.keywordSearchByColumn.orNull != null && 
keywords != null + ) { preparedStatement.setString(curIndex, keywords) curIndex += 1 } @@ -464,7 +456,7 @@ abstract class SQLSourceOpExec( } // fill up offset if progressive mode is not enabled - if (!progressive.getOrElse(false)) + if (!desc.progressive.getOrElse(false)) curOffset match { case Some(offset) => preparedStatement.setLong(curIndex, offset) @@ -488,28 +480,28 @@ abstract class SQLSourceOpExec( @throws[IllegalArgumentException] private def initBatchColumnBoundaries(): Unit = { // TODO: add interval - if (batchByAttribute.isDefined && min.isDefined && max.isDefined) { + if (batchByAttribute.isDefined && desc.min.isDefined && desc.max.isDefined) { - if (min.get.equalsIgnoreCase("auto")) curLowerBound = fetchBatchByBoundary("MIN") + if (desc.min.get.equalsIgnoreCase("auto")) curLowerBound = fetchBatchByBoundary("MIN") else batchByAttribute.get.getType match { - case AttributeType.TIMESTAMP => curLowerBound = parseTimestamp(min.get).getTime - case AttributeType.LONG => curLowerBound = min.get.toLong + case AttributeType.TIMESTAMP => curLowerBound = parseTimestamp(desc.min.get).getTime + case AttributeType.LONG => curLowerBound = desc.min.get.toLong case _ => throw new IllegalArgumentException(s"Unsupported type ${batchByAttribute.get.getType}") } - if (max.get.equalsIgnoreCase("auto")) upperBound = fetchBatchByBoundary("MAX") + if (desc.max.get.equalsIgnoreCase("auto")) upperBound = fetchBatchByBoundary("MAX") else batchByAttribute.get.getType match { - case AttributeType.TIMESTAMP => upperBound = parseTimestamp(max.get).getTime - case AttributeType.LONG => upperBound = max.get.toLong + case AttributeType.TIMESTAMP => upperBound = parseTimestamp(desc.max.get).getTime + case AttributeType.LONG => upperBound = desc.max.get.toLong case _ => throw new IllegalArgumentException(s"Unsupported type ${batchByAttribute.get.getType}") } } else { throw new IllegalArgumentException( - s"Missing required progressive configuration, $batchByAttribute, $min or $max." 
+          s"Missing required progressive configuration, $batchByAttribute, ${desc.min} or ${desc.max}."
       )
     }
   }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala
index 8ab3249d909..6f688ae8e68 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala
@@ -7,23 +7,24 @@ import com.fasterxml.jackson.annotation.{
 }
 import com.fasterxml.jackson.databind.annotation.JsonDeserialize
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow.OutputPort
 import edu.uci.ics.amber.operator.filter.FilterPredicate
-import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.{
   AutofillAttributeName,
   AutofillAttributeNameList,
   UIWidget
 }
-import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.core.workflow.OutputPort
+import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.source.sql.SQLSourceOpDesc
 import edu.uci.ics.amber.operator.source.sql.asterixdb.AsterixDBConnUtil.{
   fetchDataTypeFields,
   queryAsterixDB
 }
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import kong.unirest.json.JSONObject

 @JsonIgnoreProperties(value 
= Array("username", "password")) @@ -97,32 +98,9 @@ class AsterixDBSourceOpDesc extends SQLSourceOpDesc { workflowId, executionId, this.operatorIdentifier, - OpExecInitInfo((_, _) => - new AsterixDBSourceOpExec( - host, - port, - database, - table, - limit, - offset, - progressive, - batchByColumn, - min, - max, - interval, - keywordSearch.getOrElse(false), - keywordSearchByColumn.orNull, - keywords.orNull, - geoSearch.getOrElse(false), - geoSearchByColumns, - geoSearchBoundingBox, - regexSearch.getOrElse(false), - regexSearchByColumn.orNull, - regex.orNull, - filterCondition.getOrElse(false), - filterPredicates, - () => sourceSchema() - ) + OpExecWithClassName( + "edu.uci.ics.amber.operator.source.sql.asterixdb.AsterixDBSourceOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) @@ -131,13 +109,6 @@ class AsterixDBSourceOpDesc extends SQLSourceOpDesc { SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> sourceSchema())) ) - override def sourceSchema(): Schema = { - if (this.host == null || this.port == null || this.database == null || this.table == null) - return null - - querySchema - } - override def operatorInfo: OperatorInfo = OperatorInfo( "AsterixDB Source", @@ -149,7 +120,11 @@ class AsterixDBSourceOpDesc extends SQLSourceOpDesc { override def updatePort(): Unit = port = if (port.trim().equals("default")) "19002" else port - override def querySchema: Schema = { + override def sourceSchema(): Schema = { + if (this.host == null || this.port == null || this.database == null || this.table == null) { + return null + } + updatePort() val sb: Schema.Builder = Schema.builder() diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpExec.scala index e950c65106d..88d49b052c7 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpExec.scala @@ -2,13 +2,13 @@ package edu.uci.ics.amber.operator.source.sql.asterixdb import com.github.tototoshi.csv.CSVParser import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.parseField -import edu.uci.ics.amber.core.tuple.{AttributeType, Schema, Tuple, TupleLike} -import edu.uci.ics.amber.operator.filter.FilterPredicate +import edu.uci.ics.amber.core.tuple.{AttributeType, Tuple, TupleLike} import edu.uci.ics.amber.operator.source.sql.SQLSourceOpExec import edu.uci.ics.amber.operator.source.sql.asterixdb.AsterixDBConnUtil.{ queryAsterixDB, updateAsterixDBVersionMapping } +import edu.uci.ics.amber.util.JSONUtils.objectMapper import java.sql._ import java.time.format.DateTimeFormatter @@ -17,44 +17,12 @@ import scala.util.control.Breaks.{break, breakable} import scala.util.{Failure, Success, Try} class AsterixDBSourceOpExec private[asterixdb] ( - host: String, - port: String, - database: String, - table: String, - limit: Option[Long], - offset: Option[Long], - progressive: Option[Boolean], - batchByColumn: Option[String], - min: Option[String], - max: Option[String], - interval: Long, - keywordSearch: Boolean, - keywordSearchByColumn: String, - keywords: String, - geoSearch: Boolean, - geoSearchByColumns: List[String], - geoSearchBoundingBox: List[String], - regexSearch: Boolean, - regexSearchByColumn: String, - regex: String, - filterCondition: Boolean, - filterPredicates: List[FilterPredicate], - schemaFunc: () => Schema -) extends SQLSourceOpExec( - table, - limit, - offset, - progressive, - batchByColumn, - min, - max, - interval, - keywordSearch, - keywordSearchByColumn, - keywords, - schemaFunc - ) { + descString: String +) extends SQLSourceOpExec(descString) { + override val desc: AsterixDBSourceOpDesc = + 
objectMapper.readValue(descString, classOf[AsterixDBSourceOpDesc]) + schema = desc.sourceSchema() // format Timestamp. TODO: move to some util package private val formatter: DateTimeFormatter = DateTimeFormatter.ISO_LOCAL_DATE_TIME.withZone(ZoneId.from(ZoneOffset.UTC)) @@ -64,7 +32,7 @@ class AsterixDBSourceOpExec private[asterixdb] ( override def open(): Unit = { // update AsterixDB API version upon open - updateAsterixDBVersionMapping(host, port) + updateAsterixDBVersionMapping(desc.host, desc.port) super.open() } @@ -133,7 +101,7 @@ class AsterixDBSourceOpExec private[asterixdb] ( curQueryString = if (hasNextQuery) generateSqlQuery else None curQueryString match { case Some(query) => - curResultIterator = queryAsterixDB(host, port, query) + curResultIterator = queryAsterixDB(desc.host, desc.port, query) break() case None => curResultIterator = None @@ -215,24 +183,26 @@ class AsterixDBSourceOpExec private[asterixdb] ( */ @throws[IllegalArgumentException] def addFilterConditions(queryBuilder: StringBuilder): Unit = { - if (keywordSearch) { + if (desc.keywordSearch.getOrElse(false)) { addKeywordSearch(queryBuilder) } - if (regexSearch) { + if (desc.regexSearch.getOrElse(false)) { addRegexSearch(queryBuilder) } - if (geoSearch) { + if (desc.geoSearch.getOrElse(false)) { addGeoSearch(queryBuilder) } - if (filterCondition) { + if (desc.filterCondition.getOrElse(false)) { addGeneralFilterCondition(queryBuilder) } } private def addKeywordSearch(queryBuilder: StringBuilder): Unit = { + val keywordSearchByColumn = desc.keywordSearchByColumn.orNull + val keywords = desc.keywords.orNull if (keywordSearchByColumn != null && keywords != null) { val columnType = schema.getAttribute(keywordSearchByColumn).getType if (columnType == AttributeType.STRING) { @@ -243,6 +213,8 @@ class AsterixDBSourceOpExec private[asterixdb] ( } private def addRegexSearch(queryBuilder: StringBuilder): Unit = { + val regexSearchByColumn = desc.regexSearchByColumn.orNull + val regex = 
desc.regex.orNull if (regexSearchByColumn != null && regex != null) { val regexColumnType = schema.getAttribute(regexSearchByColumn).getType if (regexColumnType == AttributeType.STRING) { @@ -256,17 +228,17 @@ class AsterixDBSourceOpExec private[asterixdb] ( private def addGeoSearch(queryBuilder: StringBuilder): Unit = { // geolocation must contain more than 1 points to from a rectangle or polygon - if (geoSearchBoundingBox.size > 1 && geoSearchByColumns.nonEmpty) { + if (desc.geoSearchBoundingBox.size > 1 && desc.geoSearchByColumns.nonEmpty) { val shape = { - val points = geoSearchBoundingBox.flatMap(s => s.split(",").map(sub => sub.toDouble)) - if (geoSearchBoundingBox.size == 2) { + val points = desc.geoSearchBoundingBox.flatMap(s => s.split(",").map(sub => sub.toDouble)) + if (desc.geoSearchBoundingBox.size == 2) { "create_rectangle(create_point(%.6f,%.6f), create_point(%.6f,%.6f))".format(points: _*) } else { "create_polygon([" + points.map(x => "%.6f".format(x)).mkString(",") + "])" } } queryBuilder ++= " AND (" - queryBuilder ++= geoSearchByColumns + queryBuilder ++= desc.geoSearchByColumns .map { attr => s"spatial_intersect($attr, $shape)" } .mkString(" OR ") queryBuilder ++= " ) " @@ -274,8 +246,8 @@ class AsterixDBSourceOpExec private[asterixdb] ( } private def addGeneralFilterCondition(queryBuilder: StringBuilder): Unit = { - if (filterCondition && filterPredicates.nonEmpty) { - val filterString = filterPredicates + if (desc.filterCondition.getOrElse(false) && desc.filterPredicates.nonEmpty) { + val filterString = desc.filterPredicates .map(p => s"(${p.attribute} ${p.condition.getName} ${p.value})") .mkString(" OR ") queryBuilder ++= s" AND ( $filterString ) " @@ -292,9 +264,9 @@ class AsterixDBSourceOpExec private[asterixdb] ( batchByAttribute match { case Some(attribute) => val resultString = queryAsterixDB( - host, - port, - "SELECT " + side + "(" + attribute.getName + ") FROM " + database + "." 
+ table + ";"
+        desc.host,
+        desc.port,
+        "SELECT " + side + "(" + attribute.getName + ") FROM " + desc.database + "." + desc.table + ";"
       ).get.next().toString.stripLineEnd
       Try(
         parseField(
@@ -317,7 +289,7 @@ class AsterixDBSourceOpExec private[asterixdb] (
         .map((entry: (String, Int)) => {
           s"if_missing(${entry._1},null) field_${entry._2}"
         })
-      .mkString(", ")} FROM $database.$table WHERE 1 = 1 "
+      .mkString(", ")} FROM ${desc.database}.${desc.table} WHERE 1 = 1 "
   }

   override def addLimit(queryBuilder: StringBuilder): Unit = {
@@ -342,7 +314,7 @@ class AsterixDBSourceOpExec private[asterixdb] (
       }
     case None =>
       throw new IllegalArgumentException(
-        "No valid batchByColumn to iterate: " + batchByColumn.getOrElse("")
+        "No valid batchByColumn to iterate: " + desc.batchByColumn.getOrElse("")
       )
   }
 }

@@ -353,7 +325,8 @@ class AsterixDBSourceOpExec private[asterixdb] (
    */
  override protected def loadTableNames(): Unit = {
    // fetch for all tables, it is also equivalent to a health check
-    val tables = queryAsterixDB(host, port, "select `DatasetName` from Metadata.`Dataset`;")
+    val tables =
+      queryAsterixDB(desc.host, desc.port, "select `DatasetName` from Metadata.`Dataset`;")
    tables.get.foreach(table => {
      tableNames.append(table.toString.stripPrefix("\"").stripLineEnd.stripSuffix("\""))
    })
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpDesc.scala
index 073e900e658..fbc583d8b68 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpDesc.scala
@@ -1,10 +1,11 @@ package edu.uci.ics.amber.operator.source.sql.mysql

-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import 
edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.source.sql.SQLSourceOpDesc import edu.uci.ics.amber.operator.source.sql.mysql.MySQLConnUtil.connect +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.OutputPort @@ -21,26 +22,9 @@ class MySQLSourceOpDesc extends SQLSourceOpDesc { workflowId, executionId, this.operatorIdentifier, - OpExecInitInfo((_, _) => - new MySQLSourceOpExec( - host, - port, - database, - table, - username, - password, - limit, - offset, - progressive, - batchByColumn, - min, - max, - interval, - keywordSearch.getOrElse(false), - keywordSearchByColumn.orNull, - keywords.orNull, - () => sourceSchema() - ) + OpExecWithClassName( + "edu.uci.ics.amber.operator.source.sql.mysql.MySQLSourceOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpExec.scala index 56e6e25d008..9b42ad43ac0 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/mysql/MySQLSourceOpExec.scala @@ -1,53 +1,31 @@ package edu.uci.ics.amber.operator.source.sql.mysql -import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.AttributeType import edu.uci.ics.amber.operator.source.sql.SQLSourceOpExec import edu.uci.ics.amber.operator.source.sql.mysql.MySQLConnUtil.connect +import edu.uci.ics.amber.util.JSONUtils.objectMapper import java.sql._ class MySQLSourceOpExec private[mysql] ( - host: 
String, - port: String, - database: String, - table: String, - username: String, - password: String, - limit: Option[Long], - offset: Option[Long], - progressive: Option[Boolean], - batchByColumn: Option[String], - min: Option[String], - max: Option[String], - interval: Long, - keywordSearch: Boolean, - keywordSearchByColumn: String, - keywords: String, - schemaFunc: () => Schema -) extends SQLSourceOpExec( - table, - limit, - offset, - progressive, - batchByColumn, - min, - max, - interval, - keywordSearch, - keywordSearchByColumn, - keywords, - schemaFunc - ) { - + descString: String +) extends SQLSourceOpExec(descString) { + override val desc: MySQLSourceOpDesc = + objectMapper.readValue(descString, classOf[MySQLSourceOpDesc]) + schema = desc.sourceSchema() val FETCH_TABLE_NAMES_SQL = "SELECT table_name FROM information_schema.tables WHERE table_schema = ?;" @throws[SQLException] - override def establishConn(): Connection = connect(host, port, database, username, password) + override def establishConn(): Connection = + connect(desc.host, desc.port, desc.database, desc.username, desc.password) @throws[RuntimeException] override def addFilterConditions(queryBuilder: StringBuilder): Unit = { - if (keywordSearch && keywordSearchByColumn != null && keywords != null) { + val keywordSearchByColumn = desc.keywordSearchByColumn.orNull + if ( + desc.keywordSearch.getOrElse(false) && keywordSearchByColumn != null && desc.keywords != null + ) { val columnType = schema.getAttribute(keywordSearchByColumn).getType if (columnType == AttributeType.STRING) @@ -61,7 +39,7 @@ class MySQLSourceOpExec private[mysql] ( @throws[SQLException] override protected def loadTableNames(): Unit = { val preparedStatement = connection.prepareStatement(FETCH_TABLE_NAMES_SQL) - preparedStatement.setString(1, database) + preparedStatement.setString(1, desc.database) val resultSet = preparedStatement.executeQuery while ({ resultSet.next diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpDesc.scala index 4abc00c2c6b..529ec85d971 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpDesc.scala @@ -3,12 +3,13 @@ package edu.uci.ics.amber.operator.source.sql.postgresql import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.fasterxml.jackson.databind.annotation.JsonDeserialize import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.UIWidget import edu.uci.ics.amber.operator.source.sql.SQLSourceOpDesc import edu.uci.ics.amber.operator.source.sql.postgresql.PostgreSQLConnUtil.connect +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.OutputPort @@ -34,26 +35,9 @@ class PostgreSQLSourceOpDesc extends SQLSourceOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => - new PostgreSQLSourceOpExec( - host, - port, - database, - table, - username, - password, - limit, - offset, - progressive, - batchByColumn, - min, - max, - interval, - keywordSearch.getOrElse(false), - keywordSearchByColumn.orNull, - keywords.orNull, - () => sourceSchema() - ) + OpExecWithClassName( + 
"edu.uci.ics.amber.operator.source.sql.postgresql.PostgreSQLSourceOpExec", + objectMapper.writeValueAsString(this) ) ) .withInputPorts(operatorInfo.inputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpExec.scala index 73223d3e5ab..05a2a29067b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/postgresql/PostgreSQLSourceOpExec.scala @@ -1,52 +1,30 @@ package edu.uci.ics.amber.operator.source.sql.postgresql -import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.AttributeType import edu.uci.ics.amber.operator.source.sql.SQLSourceOpExec import edu.uci.ics.amber.operator.source.sql.postgresql.PostgreSQLConnUtil.connect +import edu.uci.ics.amber.util.JSONUtils.objectMapper import java.sql._ -class PostgreSQLSourceOpExec private[postgresql] ( - host: String, - port: String, - database: String, - table: String, - username: String, - password: String, - limit: Option[Long], - offset: Option[Long], - progressive: Option[Boolean], - batchByColumn: Option[String], - min: Option[String], - max: Option[String], - interval: Long, - keywordSearch: Boolean, - keywordSearchByColumn: String, - keywords: String, - schemaFunc: () => Schema -) extends SQLSourceOpExec( - table, - limit, - offset, - progressive, - batchByColumn, - min, - max, - interval, - keywordSearch, - keywordSearchByColumn, - keywords, - schemaFunc - ) { +class PostgreSQLSourceOpExec private[postgresql] (descString: String) + extends SQLSourceOpExec(descString) { + override val desc: PostgreSQLSourceOpDesc = + objectMapper.readValue(descString, classOf[PostgreSQLSourceOpDesc]) + schema = desc.sourceSchema() val 
FETCH_TABLE_NAMES_SQL = "SELECT table_name FROM information_schema.tables WHERE table_type='BASE TABLE';" @throws[SQLException] - override def establishConn(): Connection = connect(host, port, database, username, password) + override def establishConn(): Connection = + connect(desc.host, desc.port, desc.database, desc.username, desc.password) @throws[RuntimeException] override def addFilterConditions(queryBuilder: StringBuilder): Unit = { - if (keywordSearch && keywordSearchByColumn != null && keywords != null) { + val keywordSearchByColumn = desc.keywordSearchByColumn.orNull + if ( + desc.keywordSearch.getOrElse(false) && keywordSearchByColumn != null && desc.keywords != null + ) { val columnType = schema.getAttribute(keywordSearchByColumn).getType if (columnType == AttributeType.STRING) { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala index 1aab480e7d5..134af1029cd 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala @@ -2,11 +2,12 @@ package edu.uci.ics.amber.operator.split import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.google.common.base.Preconditions -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} @@ -30,7 +31,10 @@ 
class SplitOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new SplitOpExec(k, seed)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.split.SplitOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpExec.scala index 848982a1e20..a0f3544e8de 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpExec.scala @@ -2,22 +2,22 @@ package edu.uci.ics.amber.operator.split import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.workflow.PortIdentity import scala.util.Random class SplitOpExec( - k: Int, - seed: Int + descString: String ) extends OperatorExecutor { - - lazy val random = new Random(seed) + val desc: SplitOpDesc = objectMapper.readValue(descString, classOf[SplitOpDesc]) + lazy val random = new Random(desc.seed) override def processTupleMultiPort( tuple: Tuple, port: Int ): Iterator[(TupleLike, Option[PortIdentity])] = { - val isTraining = random.nextInt(100) < k + val isTraining = random.nextInt(100) < desc.k // training output port: 0, testing output port: 1 val port = if (isTraining) PortIdentity(0) else PortIdentity(1) Iterator.single((tuple, Some(port))) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala index 94a2ac3b852..e77663fdf0b 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala @@ -1,7 +1,7 @@ package edu.uci.ics.amber.operator.symmetricDifference import com.google.common.base.Preconditions -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp} import edu.uci.ics.amber.operator.LogicalOp @@ -21,7 +21,9 @@ class SymmetricDifferenceOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new SymmetricDifferenceOpExec()) + OpExecWithClassName( + "edu.uci.ics.amber.operator.symmetricDifference.SymmetricDifferenceOpExec" + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala index eca814f491c..b52f299c0ff 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala @@ -2,11 +2,12 @@ package edu.uci.ics.amber.operator.typecasting import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{AttributeTypeUtils, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.map.MapOpDesc import 
edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} @@ -27,7 +28,10 @@ class TypeCastingOpDesc extends MapOpDesc { workflowId, executionId, operatorIdentifier, - OpExecInitInfo((_, _) => new TypeCastingOpExec(typeCastingUnits)) + OpExecWithClassName( + "edu.uci.ics.amber.operator.typecasting.TypeCastingOpExec", + objectMapper.writeValueAsString(this) + ) ) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExec.scala index 998d0504583..821c76c02cc 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExec.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExec.scala @@ -2,14 +2,19 @@ package edu.uci.ics.amber.operator.typecasting import edu.uci.ics.amber.core.tuple.{AttributeTypeUtils, Tuple, TupleLike} import edu.uci.ics.amber.operator.map.MapOpExec +import edu.uci.ics.amber.util.JSONUtils.objectMapper + +class TypeCastingOpExec(descString: String) extends MapOpExec { + + private val desc: TypeCastingOpDesc = + objectMapper.readValue(descString, classOf[TypeCastingOpDesc]) -class TypeCastingOpExec(typeCastingUnits: List[TypeCastingUnit]) extends MapOpExec { this.setMapFunc(castTuple) private def castTuple(tuple: Tuple): TupleLike = AttributeTypeUtils.tupleCasting( tuple, - typeCastingUnits + desc.typeCastingUnits .map(typeCastingUnit => typeCastingUnit.attribute -> typeCastingUnit.resultType) .toMap ) diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala index a3fa40a4e01..9fe0089c4ba 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala @@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.udf.java import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.google.common.base.Preconditions import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.executor.OpExecInitInfo +import edu.uci.ics.amber.core.executor.OpExecWithCode import edu.uci.ics.amber.core.tuple.{Attribute, Schema} import edu.uci.ics.amber.core.workflow.{ PartitionInfo, @@ -95,7 +95,7 @@ class JavaUDFOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo(code, "java") + OpExecWithCode(code, "java") ) .withDerivePartition(_ => UnknownPartition()) .withInputPorts(operatorInfo.inputPorts) @@ -111,7 +111,7 @@ class JavaUDFOpDesc extends LogicalOp { workflowId, executionId, operatorIdentifier, - OpExecInitInfo(code, "java") + OpExecWithCode(code, "java") ) .withDerivePartition(_ => UnknownPartition()) .withInputPorts(operatorInfo.inputPorts) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala index 24d6bb62549..985fa54fede 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala @@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.udf.python 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.google.common.base.Preconditions
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithCode
 import edu.uci.ics.amber.core.tuple.{Attribute, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc, UnknownPartition}
 import edu.uci.ics.amber.operator.LogicalOp
@@ -70,7 +70,7 @@ class DualInputPortsPythonUDFOpDescV2 extends LogicalOp {
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo(code, "python")
+        OpExecWithCode(code, "python")
       )
       .withDerivePartition(_ => UnknownPartition())
       .withParallelizable(true)
@@ -88,7 +88,7 @@ class DualInputPortsPythonUDFOpDescV2 extends LogicalOp {
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo(code, "python")
+        OpExecWithCode(code, "python")
       )
       .withDerivePartition(_ => UnknownPartition())
       .withParallelizable(false)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala
index 3ce08b1510a..1f9b69eb326 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala
@@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.udf.python
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.google.common.base.Preconditions
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithCode
 import edu.uci.ics.amber.core.tuple.{Attribute, Schema}
 import edu.uci.ics.amber.core.workflow.{
   PartitionInfo,
@@ -104,7 +104,7 @@ class PythonUDFOpDescV2 extends LogicalOp {
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo(code, "python")
+        OpExecWithCode(code, "python")
       )
       .withDerivePartition(_ => UnknownPartition())
       .withInputPorts(operatorInfo.inputPorts)
@@ -120,7 +120,7 @@ class PythonUDFOpDescV2 extends LogicalOp {
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo(code, "python")
+        OpExecWithCode(code, "python")
       )
       .withDerivePartition(_ => UnknownPartition())
       .withInputPorts(operatorInfo.inputPorts)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala
index 3086d8e6762..086b014ea68 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.udf.python.source
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithCode
 import edu.uci.ics.amber.core.tuple.{Attribute, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
@@ -40,7 +40,6 @@ class PythonUDFSourceOpDescV2 extends SourceOperatorDescriptor {
       workflowId: WorkflowIdentity,
       executionId: ExecutionIdentity
   ): PhysicalOp = {
-    val exec = OpExecInitInfo(code, "python")
     require(workers >= 1, "Need at least 1 worker.")
 
     val func = SchemaPropagationFunc { _: Map[PortIdentity, Schema] =>
@@ -49,7 +48,7 @@ class PythonUDFSourceOpDescV2 extends SourceOperatorDescriptor {
     }
 
     val physicalOp = PhysicalOp
-      .sourcePhysicalOp(workflowId, executionId, operatorIdentifier, exec)
+      .sourcePhysicalOp(workflowId, executionId, operatorIdentifier, OpExecWithCode(code, "python"))
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
       .withIsOneToManyOp(true)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala
index 94f31d02f05..42445e21e16 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala
@@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.udf.r
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.google.common.base.Preconditions
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithCode
 import edu.uci.ics.amber.core.tuple.{Attribute, Schema}
 import edu.uci.ics.amber.core.workflow.{
   PartitionInfo,
@@ -97,7 +97,7 @@ class RUDFOpDesc extends LogicalOp {
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo(code, r_operator_type)
+        OpExecWithCode(code, r_operator_type)
       )
       .withDerivePartition(_ => UnknownPartition())
       .withInputPorts(operatorInfo.inputPorts)
@@ -113,7 +113,7 @@ class RUDFOpDesc extends LogicalOp {
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo(code, r_operator_type)
+        OpExecWithCode(code, r_operator_type)
       )
       .withDerivePartition(_ => UnknownPartition())
       .withInputPorts(operatorInfo.inputPorts)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala
index afb2e2524e4..0653228a145 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.udf.r
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithCode
 import edu.uci.ics.amber.core.tuple.{Attribute, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
@@ -49,7 +49,6 @@ class RUDFSourceOpDesc extends SourceOperatorDescriptor {
       executionId: ExecutionIdentity
   ): PhysicalOp = {
     val rOperatorType = if (useTupleAPI) "r-tuple" else "r-table"
-    val exec = OpExecInitInfo(code, rOperatorType)
     require(workers >= 1, "Need at least 1 worker.")
 
     val func = SchemaPropagationFunc { _: Map[PortIdentity, Schema] =>
@@ -58,7 +57,12 @@ class RUDFSourceOpDesc extends SourceOperatorDescriptor {
     }
 
     val physicalOp = PhysicalOp
-      .sourcePhysicalOp(workflowId, executionId, operatorIdentifier, exec)
+      .sourcePhysicalOp(
+        workflowId,
+        executionId,
+        operatorIdentifier,
+        OpExecWithCode(code, rOperatorType)
+      )
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
       .withIsOneToManyOp(true)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala
index 6e6efcc1d0c..7e75c24e7f6 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala
@@ -1,7 +1,7 @@
 package edu.uci.ics.amber.operator.union
 
 import com.google.common.base.Preconditions
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.LogicalOp
@@ -20,7 +20,7 @@ class UnionOpDesc extends LogicalOp {
         workflowId,
         executionId,
         operatorIdentifier,
-        OpExecInitInfo((_, _) => new UnionOpExec())
+        OpExecWithClassName("edu.uci.ics.amber.operator.union.UnionOpExec")
       )
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala
index 26b36b410dc..5ac736490da 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala
@@ -2,12 +2,13 @@ package edu.uci.ics.amber.operator.unneststring
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.google.common.base.Preconditions
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.flatmap.FlatMapOpDesc
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
@@ -44,7 +45,10 @@ class UnnestStringOpDesc extends FlatMapOpDesc {
       workflowId,
       executionId,
       operatorIdentifier,
-      OpExecInitInfo((_, _) => new UnnestStringOpExec(attribute, delimiter))
+      OpExecWithClassName(
+        "edu.uci.ics.amber.operator.unneststring.UnnestStringOpExec",
+        objectMapper.writeValueAsString(this)
+      )
     )
     .withInputPorts(operatorInfo.inputPorts)
     .withOutputPorts(operatorInfo.outputPorts)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExec.scala
index 09962de8d45..084d1dc0836 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExec.scala
@@ -2,14 +2,16 @@ package edu.uci.ics.amber.operator.unneststring
 
 import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike}
 import edu.uci.ics.amber.operator.flatmap.FlatMapOpExec
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 
-class UnnestStringOpExec(attributeName: String, delimiter: String) extends FlatMapOpExec {
-
+class UnnestStringOpExec(descString: String) extends FlatMapOpExec {
+  private val desc: UnnestStringOpDesc =
+    objectMapper.readValue(descString, classOf[UnnestStringOpDesc])
   setFlatMapFunc(splitByDelimiter)
 
   private def splitByDelimiter(tuple: Tuple): Iterator[TupleLike] = {
-    delimiter.r
-      .split(tuple.getField(attributeName).toString)
+    desc.delimiter.r
+      .split(tuple.getField(desc.attribute).toString)
       .filter(_.nonEmpty)
       .iterator
       .map(split => TupleLike(tuple.getFields ++ Seq(split)))
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala
index 5f2696c3cb9..2bf48a41ca8 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala
@@ -2,12 +2,13 @@ package edu.uci.ics.amber.operator.visualization.htmlviz
 
 import com.fasterxml.jackson.annotation.JsonProperty
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
@@ -30,7 +31,10 @@ class HtmlVizOpDesc extends LogicalOp {
       workflowId,
       executionId,
       operatorIdentifier,
-      OpExecInitInfo((_, _) => new HtmlVizOpExec(htmlContentAttrName))
+      OpExecWithClassName(
+        "edu.uci.ics.amber.operator.visualization.htmlviz.HtmlVizOpExec",
+        objectMapper.writeValueAsString(this)
+      )
     )
     .withInputPorts(operatorInfo.inputPorts)
     .withOutputPorts(operatorInfo.outputPorts)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExec.scala
index 1177cef30ab..e269803a83d 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExec.scala
@@ -2,11 +2,13 @@ package edu.uci.ics.amber.operator.visualization.htmlviz
 
 import edu.uci.ics.amber.core.executor.OperatorExecutor
 import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike}
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 
 /**
   * HTML Visualization operator to render any given HTML code
   */
-class HtmlVizOpExec(htmlContentAttrName: String) extends OperatorExecutor {
+class HtmlVizOpExec(descString: String) extends OperatorExecutor {
+  private val desc: HtmlVizOpDesc = objectMapper.readValue(descString, classOf[HtmlVizOpDesc])
 
   override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] =
-    Iterator(TupleLike(tuple.getField[Any](htmlContentAttrName)))
+    Iterator(TupleLike(tuple.getField[Any](desc.htmlContentAttrName)))
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala
index 7fe381c2d28..9df368b0ec9 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.urlviz
 
 import com.fasterxml.jackson.annotation.JsonProperty
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
-import edu.uci.ics.amber.core.executor.OpExecInitInfo
+import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.LogicalOp
@@ -10,6 +10,7 @@ import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdenti
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
 
 /**
@@ -30,7 +31,7 @@ class UrlVizOpDesc extends LogicalOp {
   @JsonProperty(required = true)
   @JsonSchemaTitle("URL content")
   @AutofillAttributeName
-  private val urlContentAttrName: String = ""
+  val urlContentAttrName: String = ""
   override def getPhysicalOp(
       workflowId: WorkflowIdentity,
       executionId: ExecutionIdentity
@@ -41,7 +42,10 @@ class UrlVizOpDesc extends LogicalOp {
       workflowId,
       executionId,
       operatorIdentifier,
-      OpExecInitInfo((_, _) => new UrlVizOpExec(urlContentAttrName))
+      OpExecWithClassName(
+        "edu.uci.ics.amber.operator.visualization.urlviz.UrlVizOpExec",
+        objectMapper.writeValueAsString(this)
+      )
     )
     .withInputPorts(operatorInfo.inputPorts)
     .withOutputPorts(operatorInfo.outputPorts)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpExec.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpExec.scala
index 32f05d7d6ef..88e09839861 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpExec.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpExec.scala
@@ -2,19 +2,20 @@ package edu.uci.ics.amber.operator.visualization.urlviz
 
 import edu.uci.ics.amber.core.executor.OperatorExecutor
 import edu.uci.ics.amber.core.tuple.{Tuple, TupleLike}
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 
 /**
   * URL Visualization operator to render any given URL link
  */
-class UrlVizOpExec(urlContentAttrName: String) extends OperatorExecutor {
-
+class UrlVizOpExec(descString: String) extends OperatorExecutor {
+  private val desc: UrlVizOpDesc = objectMapper.readValue(descString, classOf[UrlVizOpDesc])
   override def processTuple(tuple: Tuple, port: Int): Iterator[TupleLike] = {
     val iframe = s""" | | |
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala
index 6ffa101eb87..f952d847e7f 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala
@@ -1,6 +1,7 @@
 package edu.uci.ics.amber.operator.dictionary
 
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema, SchemaEnforceable, Tuple}
+import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import org.scalatest.BeforeAndAfter
 import org.scalatest.flatspec.AnyFlatSpec
@@ -23,14 +24,13 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
     .build()
 
   var opExec: DictionaryMatcherOpExec = _
-  var opDesc: DictionaryMatcherOpDesc = _
+  val opDesc: DictionaryMatcherOpDesc = new DictionaryMatcherOpDesc()
   var outputSchema: Schema = _
   val dictionaryScan = "nice a a person"
   val dictionarySubstring = "nice a a person and good"
   val dictionaryConjunction = "a person is nice"
 
   before {
-    opDesc = new DictionaryMatcherOpDesc()
     opDesc.attribute = "field1"
     opDesc.dictionary = dictionaryScan
     opDesc.resultAttribute = "matched"
@@ -39,7 +39,7 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
   }
 
   it should "open" in {
-    opExec = new DictionaryMatcherOpExec(opDesc.attribute, opDesc.dictionary, opDesc.matchingType)
+    opExec = new DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc))
     opExec.open()
     assert(opExec.dictionaryEntries != null)
   }
@@ -48,8 +48,8 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
   /**
    * Test cases that all Matching Types should match the query
    */
  it should "match a tuple if present in the given dictionary entry when matching type is SCANBASED" in {
-    opExec =
-      new DictionaryMatcherOpExec(opDesc.attribute, opDesc.dictionary, MatchingType.SCANBASED)
+    opDesc.matchingType = MatchingType.SCANBASED
+    opExec = new DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc))
     opExec.open()
     val processedTuple = opExec.processTuple(tuple, 0).next()
     assert(
@@ -60,6 +60,7 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
 
   it should "match a tuple if present in the given dictionary entry when matching type is SUBSTRING" in {
     opDesc.matchingType = MatchingType.SUBSTRING
+    opExec = new DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc))
     opExec.open()
     val processedTuple = opExec.processTuple(tuple, 0).next()
     assert(
@@ -70,6 +71,7 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
 
   it should "match a tuple if present in the given dictionary entry when matching type is CONJUNCTION_INDEXBASED" in {
     opDesc.matchingType = MatchingType.CONJUNCTION_INDEXBASED
+    opExec = new DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc))
     opExec.open()
     val processedTuple = opExec.processTuple(tuple, 0).next()
     assert(
@@ -82,8 +84,9 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
   /**
    * Test cases that SCANBASED and SUBSTRING Matching Types should fail to match a query
    */
  it should "not match a tuple if not present in the given dictionary entry when matching type is SCANBASED and not exact match" in {
-    opExec =
-      new DictionaryMatcherOpExec(opDesc.attribute, dictionaryConjunction, MatchingType.SCANBASED)
+    opDesc.dictionary = dictionaryConjunction
+    opDesc.matchingType = MatchingType.SCANBASED
+    opExec = new DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc))
     opExec.open()
     val processedTuple = opExec.processTuple(tuple, 0).next()
     assert(
@@ -98,6 +101,7 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
 
   it should "not match a tuple if the given dictionary entry doesn't contain all the tuple when the matching
type is SUBSTRING" in { opDesc.dictionary = dictionaryConjunction opDesc.matchingType = MatchingType.SUBSTRING + opExec = new DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val processedTuple = opExec.processTuple(tuple, 0).next() assert( @@ -110,11 +114,9 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "match a tuple if present in the given dictionary entry when matching type is CONJUNCTION_INDEXBASED even with different order" in { - opExec = new DictionaryMatcherOpExec( - opDesc.attribute, - dictionaryConjunction, - MatchingType.CONJUNCTION_INDEXBASED - ) + opDesc.dictionary = dictionaryConjunction + opDesc.matchingType = MatchingType.CONJUNCTION_INDEXBASED + opExec = new DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val processedTuple = opExec.processTuple(tuple, 0).next() assert( @@ -130,8 +132,9 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter { * Test cases that only SUBSTRING Matching Type should match the query */ it should "not match a tuple if not present in the given dictionary entry when matching type is SCANBASED when the entry contains more text" in { - opExec = - new DictionaryMatcherOpExec(opDesc.attribute, dictionarySubstring, MatchingType.SCANBASED) + opDesc.dictionary = dictionarySubstring + opDesc.matchingType = MatchingType.SCANBASED + opExec = new DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val processedTuple = opExec.processTuple(tuple, 0).next() assert( @@ -146,6 +149,7 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter { it should "not match a tuple if not present in the given dictionary entry when matching type is CONJUNCTION_INDEXBASED when the entry contains more text" in { opDesc.dictionary = dictionarySubstring opDesc.matchingType = MatchingType.CONJUNCTION_INDEXBASED + opExec = new 
DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val processedTuple = opExec.processTuple(tuple, 0).next() assert( @@ -158,8 +162,9 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "match a tuple if not present in the given dictionary entry when matching type is SUBSTRING when the entry contains more text" in { - opExec = - new DictionaryMatcherOpExec(opDesc.attribute, dictionarySubstring, MatchingType.SUBSTRING) + opDesc.dictionary = dictionarySubstring + opDesc.matchingType = MatchingType.SUBSTRING + opExec = new DictionaryMatcherOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val processedTuple = opExec.processTuple(tuple, 0).next() assert( diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala index 2826630cee3..a17642c8286 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala @@ -1,12 +1,13 @@ package edu.uci.ics.amber.operator.filter import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema, Tuple} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { val inputPort: Int = 0 - + val opDesc: SpecializedFilterOpDesc = new SpecializedFilterOpDesc() val tuplesWithOneFieldNull: Iterable[Tuple] = AttributeType .values() @@ -44,40 +45,31 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { .build() it should "open and close" in { - val opExec = new SpecializedFilterOpExec(List()) - opExec.open() - opExec.close() - } - - it should "throw when predicates 
is null" in { - val opExec = new SpecializedFilterOpExec(null) + opDesc.predicates = List() + val opExec = new SpecializedFilterOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() - assertThrows[NullPointerException] { - opExec.processTuple(allNullTuple, inputPort) - } opExec.close() } it should "do nothing when predicates is an empty list" in { - val opExec = new SpecializedFilterOpExec(List()) + opDesc.predicates = List() + val opExec = new SpecializedFilterOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() assert(opExec.processTuple(allNullTuple, inputPort).isEmpty) opExec.close() } it should "not have is_null comparisons be affected by values" in { - val opExec = new SpecializedFilterOpExec( - List(new FilterPredicate("string", ComparisonType.IS_NULL, "value")) - ) + opDesc.predicates = List(new FilterPredicate("string", ComparisonType.IS_NULL, "value")) + val opExec = new SpecializedFilterOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() assert(opExec.processTuple(allNullTuple, inputPort).nonEmpty) opExec.close() } it should "not have is_not_null comparisons be affected by values" in { - val opExec = new SpecializedFilterOpExec( - List(new FilterPredicate("string", ComparisonType.IS_NOT_NULL, "value")) - ) + opDesc.predicates = List(new FilterPredicate("string", ComparisonType.IS_NOT_NULL, "value")) + val opExec = new SpecializedFilterOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() assert(opExec.processTuple(allNullTuple, inputPort).isEmpty) opExec.close() @@ -88,11 +80,9 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { .map(nullTuple => { val attributes = nullTuple.getSchema.getAttributes assert(attributes.length == 1) - - val opExec = new SpecializedFilterOpExec( + opDesc.predicates = List(new FilterPredicate(attributes.head.getName, ComparisonType.IS_NULL, null)) - ) - + val opExec = new SpecializedFilterOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() 
assert(opExec.processTuple(nullTuple, inputPort).nonEmpty) opExec.close() @@ -100,18 +90,16 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "filter out non null tuples when filtering is_null" in { - val opExec = new SpecializedFilterOpExec( - List(new FilterPredicate("string", ComparisonType.IS_NULL, "value")) - ) + opDesc.predicates = List(new FilterPredicate("string", ComparisonType.IS_NULL, "value")) + val opExec = new SpecializedFilterOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() assert(opExec.processTuple(nonNullTuple, inputPort).isEmpty) opExec.close() } it should "output non null tuples when filter is_not_null" in { - val opExec = new SpecializedFilterOpExec( - List(new FilterPredicate("string", ComparisonType.IS_NOT_NULL, "value")) - ) + opDesc.predicates = List(new FilterPredicate("string", ComparisonType.IS_NOT_NULL, "value")) + val opExec = new SpecializedFilterOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() assert(opExec.processTuple(nonNullTuple, inputPort).nonEmpty) opExec.close() @@ -122,11 +110,9 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter { .map(nullTuple => { val attributes = nullTuple.getSchema.getAttributes assert(attributes.length == 1) - - val opExec = new SpecializedFilterOpExec( + opDesc.predicates = List(new FilterPredicate(attributes.head.getName, ComparisonType.IS_NOT_NULL, null)) - ) - + val opExec = new SpecializedFilterOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() assert(opExec.processTuple(nullTuple, inputPort).isEmpty) opExec.close() diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala index b7ecb183fcc..2049d89f7f4 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala +++ 
b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala @@ -12,6 +12,7 @@ import edu.uci.ics.amber.core.tuple.{ TupleLike } import edu.uci.ics.amber.operator.hashJoin.HashJoinBuildOpExec +import edu.uci.ics.amber.util.JSONUtils.objectMapper class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { val build: Int = 0 val probe: Int = 1 @@ -51,10 +52,11 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { opDesc = new HashJoinOpDesc[String]() opDesc.buildAttributeName = "build_1" opDesc.probeAttributeName = "probe_1" + opDesc.joinType = JoinType.INNER val inputSchemas = Array(schema("build"), schema("probe")) val outputSchema = opDesc.getOutputSchema(inputSchemas) - buildOpExec = new HashJoinBuildOpExec[String]("build_1") + buildOpExec = new HashJoinBuildOpExec[String](objectMapper.writeValueAsString(opDesc)) buildOpExec.open() (0 to 7).map(i => { @@ -67,10 +69,7 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { buildOpExec.onFinish(build) assert(buildOpOutputIterator.hasNext) - probeOpExec = new HashJoinProbeOpExec[String]( - "probe_1", - JoinType.INNER - ) + probeOpExec = new HashJoinProbeOpExec[String](objectMapper.writeValueAsString(opDesc)) probeOpExec.open() while (buildOpOutputIterator.hasNext) { @@ -109,10 +108,11 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { opDesc = new HashJoinOpDesc[String]() opDesc.buildAttributeName = "same" opDesc.probeAttributeName = "same" + opDesc.joinType = JoinType.INNER val inputSchemas = Array(schema("same", 1), schema("same", 2)) val outputSchema = opDesc.getOutputSchema(inputSchemas) - buildOpExec = new HashJoinBuildOpExec[String]("same") + buildOpExec = new HashJoinBuildOpExec[String](objectMapper.writeValueAsString(opDesc)) buildOpExec.open() (0 to 7).map(i => { @@ -124,11 +124,7 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { buildOpExec.onFinish(build) assert(buildOpOutputIterator.hasNext) - probeOpExec = new 
HashJoinProbeOpExec[String]( - "same", - JoinType.INNER - ) - + probeOpExec = new HashJoinProbeOpExec[String](objectMapper.writeValueAsString(opDesc)) probeOpExec.open() while (buildOpOutputIterator.hasNext) { @@ -166,10 +162,11 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { opDesc = new HashJoinOpDesc[String]() opDesc.buildAttributeName = "same" opDesc.probeAttributeName = "same" + opDesc.joinType = JoinType.FULL_OUTER val inputSchemas = Array(schema("same", 1), schema("same", 2)) val outputSchema = opDesc.getOutputSchema(inputSchemas) - buildOpExec = new HashJoinBuildOpExec[String]("same") + buildOpExec = new HashJoinBuildOpExec[String](objectMapper.writeValueAsString(opDesc)) buildOpExec.open() (0 to 7).map(i => { @@ -181,11 +178,7 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { buildOpExec.onFinish(build) assert(buildOpOutputIterator.hasNext) - probeOpExec = new HashJoinProbeOpExec[String]( - "same", - JoinType.FULL_OUTER - ) - + probeOpExec = new HashJoinProbeOpExec[String](objectMapper.writeValueAsString(opDesc)) probeOpExec.open() while (buildOpOutputIterator.hasNext) { diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala index c21d2308791..72c062ed319 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala @@ -16,6 +16,7 @@ import edu.uci.ics.amber.core.tuple.{ Tuple, TupleLike } +import edu.uci.ics.amber.util.JSONUtils.objectMapper class IntervalOpExecSpec extends AnyFlatSpec with BeforeAndAfter { val left: Int = 0 val right: Int = 1 @@ -222,14 +223,7 @@ class IntervalOpExecSpec extends AnyFlatSpec with BeforeAndAfter { timeIntervalType ) val outputSchema = 
opDesc.getOutputSchema(inputSchemas) - val opExec = new IntervalJoinOpExec( - leftAttributeName = leftKey, - rightAttributeName = rightKey, - includeLeftBound = includeLeftBound, - includeRightBound = includeRightBound, - constant = intervalConstant, - timeIntervalType = Some(timeIntervalType) - ) + val opExec = new IntervalJoinOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() counter = 0 var leftIndex: Int = 0 @@ -400,15 +394,13 @@ class IntervalOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "work with Double value int [] interval" in { - val opExec = new IntervalJoinOpExec( - leftAttributeName = "point_1", - rightAttributeName = "range_1", - includeLeftBound = true, - includeRightBound = true, - constant = 3, - timeIntervalType = Option(TimeIntervalType.DAY) - ) - + opDesc.leftAttributeName = "point_1" + opDesc.rightAttributeName = "range_1" + opDesc.includeLeftBound = true + opDesc.includeRightBound = true + opDesc.constant = 3 + opDesc.timeIntervalType = Option(TimeIntervalType.DAY) + val opExec = new IntervalJoinOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() counter = 0 val pointList: Array[Double] = Array(1.1, 2.1, 3.1, 4.1, 5.1, 6.1, 7.1, 8.1, 9.1, 10.1) diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExecSpec.scala index 50dcfc33515..c0e12804e7c 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExecSpec.scala @@ -1,11 +1,13 @@ package edu.uci.ics.amber.operator.keywordSearch import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema, Tuple} +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import 
org.scalatest.flatspec.AnyFlatSpec class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { val inputPort: Int = 0 + val opDesc: KeywordSearchOpDesc = new KeywordSearchOpDesc() val schema: Schema = Schema .builder() @@ -35,7 +37,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { ) it should "find exact match with single number" in { - val opExec = new KeywordSearchOpExec("text", "3") + opDesc.attribute = "text" + opDesc.keyword = "3" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).nonEmpty) assert(results.length == 1) @@ -44,7 +48,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find exact phrase match" in { - val opExec = new KeywordSearchOpExec("text", "\"3 stars\"") + opDesc.attribute = "text" + opDesc.keyword = "\"3 stars\"" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).nonEmpty) assert(results.length == 1) @@ -53,7 +59,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find all occurrences of Trump" in { - val opExec = new KeywordSearchOpExec("text", "Trump") + opDesc.attribute = "text" + opDesc.keyword = "Trump" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).nonEmpty) assert(results.length == 2) @@ -62,7 +70,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find all occurrences of Biden" in { - val opExec = new KeywordSearchOpExec("text", "Biden") + opDesc.attribute = "text" + opDesc.keyword = "Biden" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, 
inputPort).nonEmpty) assert(results.length == 1) @@ -71,7 +81,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find records containing both Trump AND Biden" in { - val opExec = new KeywordSearchOpExec("text", "Trump AND Biden") + opDesc.attribute = "text" + opDesc.keyword = "Trump AND Biden" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).nonEmpty) assert(results.length == 1) @@ -80,7 +92,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find no matches for exact phrase 'Trump AND Biden'" in { - val opExec = new KeywordSearchOpExec("text", "\"Trump AND Biden\"") + opDesc.attribute = "text" + opDesc.keyword = "\"Trump AND Biden\"" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.isEmpty) @@ -88,7 +102,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find no matches for partial word 'ell'" in { - val opExec = new KeywordSearchOpExec("text", "ell") + opDesc.attribute = "text" + opDesc.keyword = "ell" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.isEmpty) @@ -96,7 +112,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find exact match for word 'the'" in { - val opExec = new KeywordSearchOpExec("text", "the") + opDesc.attribute = "text" + opDesc.keyword = "the" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.length == 1) @@ -105,7 +123,9 @@ class KeywordSearchOpExecSpec extends 
AnyFlatSpec with BeforeAndAfter { } it should "find exact match for word 'an'" in { - val opExec = new KeywordSearchOpExec("text", "an") + opDesc.attribute = "text" + opDesc.keyword = "an" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.length == 1) @@ -114,7 +134,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find exact match for word 'to'" in { - val opExec = new KeywordSearchOpExec("text", "to") + opDesc.attribute = "text" + opDesc.keyword = "to" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.length == 1) @@ -123,7 +145,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find case-insensitive match for 'twitter'" in { - val opExec = new KeywordSearchOpExec("text", "twitter") + opDesc.attribute = "text" + opDesc.keyword = "twitter" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.length == 1) @@ -132,7 +156,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find exact match for Korean text '안녕하세요'" in { - val opExec = new KeywordSearchOpExec("text", "안녕하세요") + opDesc.attribute = "text" + opDesc.keyword = "안녕하세요" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.length == 1) @@ -141,7 +167,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find exact match for Chinese text '你好'" in { - val opExec = new KeywordSearchOpExec("text", "你好") + opDesc.attribute = "text" + 
opDesc.keyword = "你好" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.length == 1) @@ -150,7 +178,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find no matches for special character '@'" in { - val opExec = new KeywordSearchOpExec("text", "@") + opDesc.attribute = "text" + opDesc.keyword = "@" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.isEmpty) @@ -158,7 +188,9 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "find exact match for special characters '_!@,-'" in { - val opExec = new KeywordSearchOpExec("text", "_!@,-") + opDesc.attribute = "text" + opDesc.keyword = "_!@,-" + val opExec = new KeywordSearchOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() val results = testData.filter(t => opExec.processTuple(t, inputPort).hasNext) assert(results.isEmpty) diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExecSpec.scala index 0f0a420cf79..edd889734d7 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExecSpec.scala @@ -1,6 +1,7 @@ package edu.uci.ics.amber.operator.projection import edu.uci.ics.amber.core.tuple._ +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec class ProjectionOpExecSpec extends AnyFlatSpec with BeforeAndAfter { @@ -20,31 +21,30 @@ class ProjectionOpExecSpec extends AnyFlatSpec with 
BeforeAndAfter { true ) .build() + val opDesc: ProjectionOpDesc = new ProjectionOpDesc() it should "open" in { - val projectionOpExec = new ProjectionOpExec( - List( - new AttributeUnit("field2", "f2"), - new AttributeUnit("field1", "f1") - ) + opDesc.attributes = List( + new AttributeUnit("field2", "f2"), + new AttributeUnit("field1", "f1") ) + val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc)) projectionOpExec.open() } it should "process Tuple" in { + opDesc.attributes = List( + new AttributeUnit("field2", "f2"), + new AttributeUnit("field1", "f1") + ) val outputSchema = Schema .builder() .add(new Attribute("f1", AttributeType.STRING)) .add(new Attribute("f2", AttributeType.INTEGER)) .build() - val projectionOpExec = new ProjectionOpExec( - List( - new AttributeUnit("field2", "f2"), - new AttributeUnit("field1", "f1") - ) - ) + val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc)) projectionOpExec.open() val outputTuple = @@ -59,20 +59,18 @@ class ProjectionOpExecSpec extends AnyFlatSpec with BeforeAndAfter { assert(outputTuple.getField[String](0) == "hello") assert(outputTuple.getField[Int](1) == 1) } - it should "process Tuple with different order" in { + opDesc.attributes = List( + new AttributeUnit("field3", "f3"), + new AttributeUnit("field1", "f1") + ) val outputSchema = Schema .builder() .add(new Attribute("f3", AttributeType.BOOLEAN)) .add(new Attribute("f1", AttributeType.STRING)) .build() - val projectionOpExec = new ProjectionOpExec( - List( - new AttributeUnit("field3", "f3"), - new AttributeUnit("field1", "f1") - ) - ) + val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc)) projectionOpExec.open() val outputTuple = @@ -88,54 +86,48 @@ class ProjectionOpExecSpec extends AnyFlatSpec with BeforeAndAfter { assert(outputTuple.getField[String](1) == "hello") } - it should "raise RuntimeException on non-existing fields" in { - val projectionOpExec = new 
ProjectionOpExec( - List( - new AttributeUnit("field---5", "f5"), - new AttributeUnit("field---6", "f6") - ) + it should "raise RuntimeException on non-existing fields" in { + opDesc.attributes = List( + new AttributeUnit("field---5", "f5"), + new AttributeUnit("field---6", "f6") ) + val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc)) assertThrows[RuntimeException] { projectionOpExec.processTuple(tuple, 0).next() } - } it should "raise IllegalArgumentException on empty attributes" in { - val projectionOpExec = new ProjectionOpExec(List()) + opDesc.attributes = List() + val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc)) assertThrows[IllegalArgumentException] { projectionOpExec.processTuple(tuple, 0).next() } - } it should "raise RuntimeException on duplicate alias" in { - val projectionOpExec = new ProjectionOpExec( - List( - new AttributeUnit("field1", "f"), - new AttributeUnit("field2", "f") - ) + opDesc.attributes = List( + new AttributeUnit("field1", "f"), + new AttributeUnit("field2", "f") ) - + val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc)) assertThrows[RuntimeException] { projectionOpExec.processTuple(tuple, 0).next() } - } it should "allow empty alias" in { + opDesc.attributes = List( + new AttributeUnit("field2", "f2"), + new AttributeUnit("field1", "") + ) val outputSchema = Schema .builder() .add(new Attribute("field1", AttributeType.STRING)) .add(new Attribute("f2", AttributeType.INTEGER)) .build() - val projectionOpExec = new ProjectionOpExec( - List( - new AttributeUnit("field2", "f2"), - new AttributeUnit("field1", "") - ) - ) + val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc)) projectionOpExec.open() val outputTuple =
a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExecSpec.scala index 896495041d0..aeab7443c1d 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExecSpec.scala @@ -1,6 +1,7 @@ package edu.uci.ics.amber.operator.sortPartitions import edu.uci.ics.amber.core.tuple._ +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec class SortPartitionsOpExecSpec extends AnyFlatSpec with BeforeAndAfter { @@ -22,15 +23,11 @@ class SortPartitionsOpExecSpec extends AnyFlatSpec with BeforeAndAfter { ) .build() - var opExec: SortPartitionOpExec = _ + val opDesc: SortPartitionsOpDesc = new SortPartitionsOpDesc() + opDesc.sortAttributeName = "field2" + var opExec: SortPartitionsOpExec = _ before { - opExec = new SortPartitionOpExec( - "field2", - 0, - 0, - 6, - 1 - ) + opExec = new SortPartitionsOpExec(objectMapper.writeValueAsString(opDesc)) } it should "open" in { diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpExecSpec.scala index f531661df3c..5cef468f7a4 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpExecSpec.scala @@ -1,21 +1,28 @@ package edu.uci.ics.amber.operator.source.fetcher import edu.uci.ics.amber.core.tuple.Schema +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import 
org.scalatest.flatspec.AnyFlatSpec class URLFetcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter { val resultSchema: Schema = new URLFetcherOpDesc().sourceSchema() + val opDesc: URLFetcherOpDesc = new URLFetcherOpDesc() + it should "fetch url and output one tuple with raw bytes" in { - val fetcherOpExec = new URLFetcherOpExec("https://www.google.com", DecodingMethod.RAW_BYTES) + opDesc.url = "https://www.google.com" + opDesc.decodingMethod = DecodingMethod.RAW_BYTES + val fetcherOpExec = new URLFetcherOpExec(objectMapper.writeValueAsString(opDesc)) val iterator = fetcherOpExec.produceTuple() assert(iterator.next().getFields.toList.head.isInstanceOf[Array[Byte]]) assert(!iterator.hasNext) } it should "fetch url and output one tuple with UTF-8 string" in { - val fetcherOpExec = new URLFetcherOpExec("https://www.google.com", DecodingMethod.UTF_8) + opDesc.url = "https://www.google.com" + opDesc.decodingMethod = DecodingMethod.UTF_8 + val fetcherOpExec = new URLFetcherOpExec(objectMapper.writeValueAsString(opDesc)) val iterator = fetcherOpExec.produceTuple() assert(iterator.next().getFields.toList.head.isInstanceOf[String]) assert(!iterator.hasNext) diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala index cb9d031952b..bf1a6122e1c 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala @@ -26,10 +26,10 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { parallelCsvScanSourceOpDesc.fileName = Some(TestOperators.CountrySalesSmallCsvPath) parallelCsvScanSourceOpDesc.customDelimiter = Some(",") parallelCsvScanSourceOpDesc.hasHeader = true - parallelCsvScanSourceOpDesc.setFileUri( + 
parallelCsvScanSourceOpDesc.setResolvedFileName( FileResolver.resolve(parallelCsvScanSourceOpDesc.fileName.get) ) - val inferredSchema: Schema = parallelCsvScanSourceOpDesc.inferSchema() + val inferredSchema: Schema = parallelCsvScanSourceOpDesc.sourceSchema() assert(inferredSchema.getAttributes.length == 14) assert(inferredSchema.getAttribute("Order ID").getType == AttributeType.INTEGER) @@ -42,11 +42,11 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { parallelCsvScanSourceOpDesc.fileName = Some(TestOperators.CountrySalesHeaderlessSmallCsvPath) parallelCsvScanSourceOpDesc.customDelimiter = Some(",") parallelCsvScanSourceOpDesc.hasHeader = false - parallelCsvScanSourceOpDesc.setFileUri( + parallelCsvScanSourceOpDesc.setResolvedFileName( FileResolver.resolve(parallelCsvScanSourceOpDesc.fileName.get) ) - val inferredSchema: Schema = parallelCsvScanSourceOpDesc.inferSchema() + val inferredSchema: Schema = parallelCsvScanSourceOpDesc.sourceSchema() assert(inferredSchema.getAttributes.length == 14) assert(inferredSchema.getAttribute("column-10").getType == AttributeType.DOUBLE) @@ -58,9 +58,9 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { csvScanSourceOpDesc.fileName = Some(TestOperators.CountrySalesSmallMultiLineCsvPath) csvScanSourceOpDesc.customDelimiter = Some(",") csvScanSourceOpDesc.hasHeader = true - csvScanSourceOpDesc.setFileUri(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) + csvScanSourceOpDesc.setResolvedFileName(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) - val inferredSchema: Schema = csvScanSourceOpDesc.inferSchema() + val inferredSchema: Schema = csvScanSourceOpDesc.sourceSchema() assert(inferredSchema.getAttributes.length == 14) assert(inferredSchema.getAttribute("Order ID").getType == AttributeType.INTEGER) @@ -72,9 +72,9 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { csvScanSourceOpDesc.fileName = 
Some(TestOperators.CountrySalesHeaderlessSmallCsvPath) csvScanSourceOpDesc.customDelimiter = Some(",") csvScanSourceOpDesc.hasHeader = false - csvScanSourceOpDesc.setFileUri(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) + csvScanSourceOpDesc.setResolvedFileName(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) - val inferredSchema: Schema = csvScanSourceOpDesc.inferSchema() + val inferredSchema: Schema = csvScanSourceOpDesc.sourceSchema() assert(inferredSchema.getAttributes.length == 14) assert(inferredSchema.getAttribute("column-10").getType == AttributeType.DOUBLE) @@ -87,9 +87,9 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { Some(TestOperators.CountrySalesSmallMultiLineCustomDelimiterCsvPath) csvScanSourceOpDesc.customDelimiter = Some(";") csvScanSourceOpDesc.hasHeader = false - csvScanSourceOpDesc.setFileUri(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) + csvScanSourceOpDesc.setResolvedFileName(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) - val inferredSchema: Schema = csvScanSourceOpDesc.inferSchema() + val inferredSchema: Schema = csvScanSourceOpDesc.sourceSchema() assert(inferredSchema.getAttributes.length == 14) assert(inferredSchema.getAttribute("column-10").getType == AttributeType.DOUBLE) @@ -102,7 +102,7 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { Some(TestOperators.CountrySalesSmallMultiLineCustomDelimiterCsvPath) csvScanSourceOpDesc.customDelimiter = Some(";") csvScanSourceOpDesc.hasHeader = false - csvScanSourceOpDesc.setFileUri(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) + csvScanSourceOpDesc.setResolvedFileName(FileResolver.resolve(csvScanSourceOpDesc.fileName.get)) assert( !csvScanSourceOpDesc diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/text/FileScanSourceOpDescSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/text/FileScanSourceOpDescSpec.scala index 
15ba5adc096..e8b07062cb7 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/text/FileScanSourceOpDescSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/text/FileScanSourceOpDescSpec.scala @@ -9,6 +9,7 @@ import edu.uci.ics.amber.operator.source.scan.{ FileScanSourceOpDesc, FileScanSourceOpExec } +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec @@ -18,12 +19,12 @@ class FileScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { before { fileScanSourceOpDesc = new FileScanSourceOpDesc() - fileScanSourceOpDesc.setFileUri(FileResolver.resolve(TestOperators.TestTextFilePath)) + fileScanSourceOpDesc.setResolvedFileName(FileResolver.resolve(TestOperators.TestTextFilePath)) fileScanSourceOpDesc.fileEncoding = FileDecodingMethod.UTF_8 } it should "infer schema with single column representing each line of text in normal text scan mode" in { - val inferredSchema: Schema = fileScanSourceOpDesc.inferSchema() + val inferredSchema: Schema = fileScanSourceOpDesc.sourceSchema() assert(inferredSchema.getAttributes.length == 1) assert(inferredSchema.getAttribute("line").getType == AttributeType.STRING) @@ -31,7 +32,7 @@ class FileScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { it should "infer schema with single column representing entire file in outputAsSingleTuple mode" in { fileScanSourceOpDesc.attributeType = FileAttributeType.SINGLE_STRING - val inferredSchema: Schema = fileScanSourceOpDesc.inferSchema() + val inferredSchema: Schema = fileScanSourceOpDesc.sourceSchema() assert(inferredSchema.getAttributes.length == 1) assert(inferredSchema.getAttribute("line").getType == AttributeType.STRING) @@ -41,7 +42,7 @@ class FileScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { fileScanSourceOpDesc.attributeType = FileAttributeType.STRING val customOutputAttributeName: String = "testing" 
fileScanSourceOpDesc.attributeName = customOutputAttributeName - val inferredSchema: Schema = fileScanSourceOpDesc.inferSchema() + val inferredSchema: Schema = fileScanSourceOpDesc.sourceSchema() assert(inferredSchema.getAttributes.length == 1) assert(inferredSchema.getAttribute("testing").getType == AttributeType.STRING) @@ -49,7 +50,7 @@ class FileScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { it should "infer schema with integer attribute type" in { fileScanSourceOpDesc.attributeType = FileAttributeType.INTEGER - val inferredSchema: Schema = fileScanSourceOpDesc.inferSchema() + val inferredSchema: Schema = fileScanSourceOpDesc.sourceSchema() assert(inferredSchema.getAttributes.length == 1) assert(inferredSchema.getAttribute("line").getType == AttributeType.INTEGER) @@ -59,15 +60,7 @@ class FileScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { fileScanSourceOpDesc.attributeType = FileAttributeType.STRING fileScanSourceOpDesc.fileScanLimit = Option(5) val FileScanSourceOpExec = - new FileScanSourceOpExec( - fileScanSourceOpDesc.fileUri.get, - fileScanSourceOpDesc.attributeType, - fileScanSourceOpDesc.fileEncoding, - fileScanSourceOpDesc.extract, - fileScanSourceOpDesc.outputFileName, - fileScanSourceOpDesc.fileScanLimit, - fileScanSourceOpDesc.fileScanOffset - ) + new FileScanSourceOpExec(objectMapper.writeValueAsString(fileScanSourceOpDesc)) FileScanSourceOpExec.open() val processedTuple: Iterator[Tuple] = FileScanSourceOpExec .produceTuple() @@ -85,19 +78,13 @@ class FileScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { } it should "read first 5 lines of the input text file with CRLF separators into corresponding output tuples" in { - fileScanSourceOpDesc.setFileUri(FileResolver.resolve(TestOperators.TestCRLFTextFilePath)) + fileScanSourceOpDesc.setResolvedFileName( + FileResolver.resolve(TestOperators.TestCRLFTextFilePath) + ) fileScanSourceOpDesc.attributeType = FileAttributeType.STRING 
fileScanSourceOpDesc.fileScanLimit = Option(5) val FileScanSourceOpExec = - new FileScanSourceOpExec( - fileScanSourceOpDesc.fileUri.get, - fileScanSourceOpDesc.attributeType, - fileScanSourceOpDesc.fileEncoding, - fileScanSourceOpDesc.extract, - fileScanSourceOpDesc.outputFileName, - fileScanSourceOpDesc.fileScanLimit, - fileScanSourceOpDesc.fileScanOffset - ) + new FileScanSourceOpExec(objectMapper.writeValueAsString(fileScanSourceOpDesc)) FileScanSourceOpExec.open() val processedTuple: Iterator[Tuple] = FileScanSourceOpExec .produceTuple() @@ -117,15 +104,7 @@ class FileScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { it should "read first 5 lines of the input text file into a single output tuple" in { fileScanSourceOpDesc.attributeType = FileAttributeType.SINGLE_STRING val FileScanSourceOpExec = - new FileScanSourceOpExec( - fileScanSourceOpDesc.fileUri.get, - fileScanSourceOpDesc.attributeType, - fileScanSourceOpDesc.fileEncoding, - fileScanSourceOpDesc.extract, - fileScanSourceOpDesc.outputFileName, - fileScanSourceOpDesc.fileScanLimit, - fileScanSourceOpDesc.fileScanOffset - ) + new FileScanSourceOpExec(objectMapper.writeValueAsString(fileScanSourceOpDesc)) FileScanSourceOpExec.open() val processedTuple: Iterator[Tuple] = FileScanSourceOpExec .produceTuple() @@ -144,18 +123,13 @@ class FileScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { } it should "read first 5 lines of the input text into corresponding output INTEGER tuples" in { - fileScanSourceOpDesc.setFileUri(FileResolver.resolve(TestOperators.TestNumbersFilePath)) + fileScanSourceOpDesc.setResolvedFileName( + FileResolver.resolve(TestOperators.TestNumbersFilePath) + ) fileScanSourceOpDesc.attributeType = FileAttributeType.INTEGER fileScanSourceOpDesc.fileScanLimit = Option(5) - val FileScanSourceOpExec = new FileScanSourceOpExec( - fileScanSourceOpDesc.fileUri.get, - fileScanSourceOpDesc.attributeType, - fileScanSourceOpDesc.fileEncoding, - fileScanSourceOpDesc.extract, - 
fileScanSourceOpDesc.outputFileName, - fileScanSourceOpDesc.fileScanLimit, - fileScanSourceOpDesc.fileScanOffset - ) + val FileScanSourceOpExec = + new FileScanSourceOpExec(objectMapper.writeValueAsString(fileScanSourceOpDesc)) FileScanSourceOpExec.open() val processedTuple: Iterator[Tuple] = FileScanSourceOpExec .produceTuple() @@ -173,20 +147,14 @@ class FileScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { } it should "read first 5 lines of the input text file with US_ASCII encoding" in { - fileScanSourceOpDesc.setFileUri(FileResolver.resolve(TestOperators.TestCRLFTextFilePath)) + fileScanSourceOpDesc.setResolvedFileName( + FileResolver.resolve(TestOperators.TestCRLFTextFilePath) + ) fileScanSourceOpDesc.fileEncoding = FileDecodingMethod.ASCII fileScanSourceOpDesc.attributeType = FileAttributeType.STRING fileScanSourceOpDesc.fileScanLimit = Option(5) val FileScanSourceOpExec = - new FileScanSourceOpExec( - fileScanSourceOpDesc.fileUri.get, - fileScanSourceOpDesc.attributeType, - fileScanSourceOpDesc.fileEncoding, - fileScanSourceOpDesc.extract, - fileScanSourceOpDesc.outputFileName, - fileScanSourceOpDesc.fileScanLimit, - fileScanSourceOpDesc.fileScanOffset - ) + new FileScanSourceOpExec(objectMapper.writeValueAsString(fileScanSourceOpDesc)) FileScanSourceOpExec.open() val processedTuple: Iterator[Tuple] = FileScanSourceOpExec .produceTuple() diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDescSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDescSpec.scala index 8f600070fc1..e94e3d9570e 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDescSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDescSpec.scala @@ -3,6 +3,7 @@ package edu.uci.ics.amber.operator.source.scan.text import 
edu.uci.ics.amber.core.tuple.{AttributeType, Schema, SchemaEnforceable, Tuple} import edu.uci.ics.amber.operator.TestOperators import edu.uci.ics.amber.operator.source.scan.FileAttributeType +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec @@ -51,8 +52,11 @@ class TextInputSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { it should "read first 5 lines of the input text into corresponding output tuples" in { val inputString: String = readFileIntoString(TestOperators.TestTextFilePath) + textInputSourceOpDesc.attributeType = FileAttributeType.STRING + textInputSourceOpDesc.textInput = inputString + textInputSourceOpDesc.fileScanLimit = Option(5) val textScanSourceOpExec = - new TextInputSourceOpExec(FileAttributeType.STRING, inputString, fileScanLimit = Option(5)) + new TextInputSourceOpExec(objectMapper.writeValueAsString(textInputSourceOpDesc)) textScanSourceOpExec.open() val processedTuple: Iterator[Tuple] = textScanSourceOpExec .produceTuple() @@ -73,8 +77,11 @@ class TextInputSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { it should "read first 5 lines of the input text with CRLF separators into corresponding output tuples" in { val inputString: String = readFileIntoString(TestOperators.TestCRLFTextFilePath) + textInputSourceOpDesc.attributeType = FileAttributeType.STRING + textInputSourceOpDesc.textInput = inputString + textInputSourceOpDesc.fileScanLimit = Option(5) val textScanSourceOpExec = - new TextInputSourceOpExec(FileAttributeType.STRING, inputString, fileScanLimit = Option(5)) + new TextInputSourceOpExec(objectMapper.writeValueAsString(textInputSourceOpDesc)) textScanSourceOpExec.open() val processedTuple: Iterator[Tuple] = textScanSourceOpExec .produceTuple() @@ -95,8 +102,10 @@ class TextInputSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { it should "read first 5 lines of the input text into a single output tuple" in { val inputString: 
String = readFileIntoString(TestOperators.TestTextFilePath) + textInputSourceOpDesc.attributeType = FileAttributeType.SINGLE_STRING + textInputSourceOpDesc.textInput = inputString val textScanSourceOpExec = - new TextInputSourceOpExec(FileAttributeType.SINGLE_STRING, inputString) + new TextInputSourceOpExec(objectMapper.writeValueAsString(textInputSourceOpDesc)) textScanSourceOpExec.open() val processedTuple: Iterator[Tuple] = textScanSourceOpExec .produceTuple() @@ -119,8 +128,10 @@ class TextInputSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { it should "read first 5 lines of the input text into corresponding output INTEGER tuples" in { val inputString: String = readFileIntoString(TestOperators.TestNumbersFilePath) textInputSourceOpDesc.attributeType = FileAttributeType.INTEGER + textInputSourceOpDesc.textInput = inputString + textInputSourceOpDesc.fileScanLimit = Option(5) val textScanSourceOpExec = - new TextInputSourceOpExec(FileAttributeType.INTEGER, inputString, fileScanLimit = Option(5)) + new TextInputSourceOpExec(objectMapper.writeValueAsString(textInputSourceOpDesc)) textScanSourceOpExec.open() val processedTuple: Iterator[Tuple] = textScanSourceOpExec .produceTuple() diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala index ae1070f62ba..39b3b5e8d60 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala @@ -1,6 +1,7 @@ package edu.uci.ics.amber.operator.typecasting import edu.uci.ics.amber.core.tuple._ +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec class TypeCastingOpExecSpec extends AnyFlatSpec with BeforeAndAfter { @@ 
-27,6 +28,8 @@ class TypeCastingOpExecSpec extends AnyFlatSpec with BeforeAndAfter { castingUnit2.resultType = AttributeType.STRING val castingUnits: List[TypeCastingUnit] = List(castingUnit1, castingUnit2) + val opDesc: TypeCastingOpDesc = new TypeCastingOpDesc() + opDesc.typeCastingUnits = castingUnits val tuple: Tuple = Tuple .builder(tupleSchema) .add(new Attribute("field1", AttributeType.STRING), "hello") @@ -42,14 +45,15 @@ class TypeCastingOpExecSpec extends AnyFlatSpec with BeforeAndAfter { .build() it should "open" in { - val typeCastingOpExec = new TypeCastingOpExec(castingUnits) + + val typeCastingOpExec = new TypeCastingOpExec(objectMapper.writeValueAsString(opDesc)) typeCastingOpExec.open() } it should "process Tuple" in { - val typeCastingOpExec = new TypeCastingOpExec(castingUnits) + val typeCastingOpExec = new TypeCastingOpExec(objectMapper.writeValueAsString(opDesc)) typeCastingOpExec.open() diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala index 96cfb08acde..8ce75b6fd5f 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala @@ -2,6 +2,7 @@ package edu.uci.ics.amber.operator.unneststring import edu.uci.ics.amber.core.tuple._ import edu.uci.ics.amber.core.workflow.PortIdentity +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { @@ -32,14 +33,18 @@ class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "open" in { - opExec = new UnnestStringOpExec(attributeName = "field1", delimiter = "-") + opDesc.attribute = "field1" + 
opDesc.delimiter = "-" + opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) opExec.open() assert(opExec.flatMapFunc != null) } it should "split value in the given attribute and output the split result in the result attribute, one for each tuple" in { - opExec = new UnnestStringOpExec(attributeName = "field1", delimiter = "-") + opDesc.attribute = "field1" + opDesc.delimiter = "-" + opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) opExec.open() val processedTuple = opExec @@ -54,7 +59,8 @@ class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { it should "generate the correct tuple when there is no delimiter in the value" in { opDesc.attribute = "field3" - opExec = new UnnestStringOpExec(attributeName = "field3", delimiter = "-") + opDesc.delimiter = "-" + opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) opExec.open() val processedTuple = opExec @@ -66,8 +72,9 @@ class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "only contain split results that are not null" in { + opDesc.attribute = "field1" opDesc.delimiter = "/" - opExec = new UnnestStringOpExec(attributeName = "field1", delimiter = "/") + opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) val tuple: Tuple = Tuple .builder(tupleSchema) @@ -87,8 +94,9 @@ class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "split by regex delimiter" in { + opDesc.attribute = "field1" opDesc.delimiter = "<\\d*>" - opExec = new UnnestStringOpExec(attributeName = "field1", delimiter = "<\\d*>") + opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) val tuple: 
Tuple = Tuple .builder(tupleSchema) diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExecSpec.scala index 820534496bd..f8aa526a0c9 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExecSpec.scala @@ -1,6 +1,7 @@ package edu.uci.ics.amber.operator.visualization.htmlviz import edu.uci.ics.amber.core.tuple._ +import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec class HtmlVizOpExecSpec extends AnyFlatSpec with BeforeAndAfter { @@ -8,9 +9,9 @@ class HtmlVizOpExecSpec extends AnyFlatSpec with BeforeAndAfter { new Attribute("field1", AttributeType.STRING), new Attribute("field2", AttributeType.STRING) ) - val desc: HtmlVizOpDesc = new HtmlVizOpDesc() + val opDesc: HtmlVizOpDesc = new HtmlVizOpDesc() - val outputSchema: Schema = desc.getOutputSchema(Array(schema)) + val outputSchema: Schema = opDesc.getOutputSchema(Array(schema)) def tuple(): Tuple = Tuple @@ -19,7 +20,8 @@ class HtmlVizOpExecSpec extends AnyFlatSpec with BeforeAndAfter { .build() it should "process a target field" in { - val htmlVizOpExec = new HtmlVizOpExec("field1") + opDesc.htmlContentAttrName = "field1" + val htmlVizOpExec = new HtmlVizOpExec(objectMapper.writeValueAsString(opDesc)) htmlVizOpExec.open() val processedTuple: Tuple = htmlVizOpExec @@ -33,8 +35,8 @@ class HtmlVizOpExecSpec extends AnyFlatSpec with BeforeAndAfter { } it should "process another target field" in { - - val htmlVizOpExec = new HtmlVizOpExec("field2") + opDesc.htmlContentAttrName = "field2" + val htmlVizOpExec = new HtmlVizOpExec(objectMapper.writeValueAsString(opDesc)) htmlVizOpExec.open() val 
processedTuple: Tuple = htmlVizOpExec From 90f99e52710dc7daf307fbf708a6efde96c7ea55 Mon Sep 17 00:00:00 2001 From: Shengquan Ni <13672781+shengquan-ni@users.noreply.github.com> Date: Tue, 31 Dec 2024 13:41:15 -0800 Subject: [PATCH 23/47] Enhance error handling and stack trace formatting (#3185) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit This PR improves exception handling in AsyncRPCServer to unwrap the actual exception from InvocationTargetException. Old: (screenshot, 2024-12-31 2:38 AM) New: (screenshot, 2024-12-31 2:33 AM) --- .../engine/common/rpc/AsyncRPCServer.scala | 8 ++++++- .../edu/uci/ics/amber/error/ErrorUtils.scala | 22 +++++++++++-------- 2 files changed, 20 insertions(+), 10 deletions(-) diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCServer.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCServer.scala index eae8f455049..1977bc9764d 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCServer.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/common/rpc/AsyncRPCServer.scala @@ -59,7 +59,13 @@ class AsyncRPCServer( ): Unit = { try { val result = - method.invoke(handler, requestArg, contextArg) + try { + method.invoke(handler, requestArg, contextArg) + } catch { + case e: java.lang.reflect.InvocationTargetException => + throw Option(e.getCause).getOrElse(e) + case e: Throwable => throw e + } result .asInstanceOf[Future[ControlReturn]] .onSuccess { ret => diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/error/ErrorUtils.scala b/core/amber/src/main/scala/edu/uci/ics/amber/error/ErrorUtils.scala index c1569587fbb..f2f97d56192 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/error/ErrorUtils.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/error/ErrorUtils.scala @@ -39,8 +39,13 @@ object ErrorUtils { } def mkControlError(err: Throwable): ControlError = { - val stacktrace =
err.getStackTrace.mkString("\n") - ControlError(err.toString, err.getCause.toString, stacktrace, ErrorLanguage.SCALA) + // Format each stack trace element with "at " prefix + val stacktrace = err.getStackTrace.map(element => s"at ${element}").mkString("\n") + if (err.getCause != null) { + ControlError(err.toString, err.getCause.toString, stacktrace, ErrorLanguage.SCALA) + } else { + ControlError(err.toString, "", stacktrace, ErrorLanguage.SCALA) + } } def reconstructThrowable(controlError: ControlError): Throwable = { @@ -52,14 +57,13 @@ object ErrorUtils { val causeThrowable = new Throwable(controlError.errorDetails) reconstructedThrowable.initCause(causeThrowable) } - val stackTraceElements = controlError.stackTrace.split("\n").map { line => - // You need to split each line appropriately to extract the class, method, file, and line number - val stackTracePattern = """\s*at\s+(.+)\((.+):(\d+)\)""".r + + val stackTracePattern = """\s*at\s+(.+)\((.*)\)""".r + val stackTraceElements = controlError.stackTrace.split("\n").flatMap { line => line match { - case stackTracePattern(className, fileName, lineNumber) => - new StackTraceElement(className, "", fileName, lineNumber.toInt) - case _ => - new StackTraceElement("", "", null, -1) // Handle if stack trace format is invalid + case stackTracePattern(className, location) => + Some(new StackTraceElement(className, "", location, -1)) + case _ => None } } reconstructedThrowable.setStackTrace(stackTraceElements) From 0eb36d095734f1c510e35999be72ebf25433dd5f Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Tue, 31 Dec 2024 15:14:24 -0800 Subject: [PATCH 24/47] Remove logical schema propagation (#3186) This PR removes all schema propagation functions from the logical plan. Developers are now required to implement `SchemaPropagationFunc` directly within the PhysicalPlan. 
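The stack-trace round trip in PATCH 23 above — serialize each frame with an `at ` prefix in `mkControlError`, then parse it back in `reconstructThrowable` with the pattern `\s*at\s+(.+)\((.*)\)` — can be sketched in plain Java. This is an illustrative standalone class, not Texera's API; the class and method names are made up for the example:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class StackTraceRoundTrip {
    // Same pattern as the patch: "at <declaringClass.method>(<location>)";
    // the location may be "File.java:123", "Native Method", or "Unknown Source".
    static final Pattern AT_LINE = Pattern.compile("\\s*at\\s+(.+)\\((.*)\\)");

    // Mirrors mkControlError: one "at ..." line per frame.
    static String format(Throwable err) {
        StringBuilder sb = new StringBuilder();
        for (StackTraceElement e : err.getStackTrace()) {
            sb.append("at ").append(e).append('\n');
        }
        return sb.toString();
    }

    // Mirrors the revised reconstructThrowable: keep only lines the pattern
    // recognizes (flatMap + Option in the Scala version), instead of emitting
    // placeholder frames for malformed lines as the old code did.
    static List<StackTraceElement> parse(String stackTrace) {
        List<StackTraceElement> frames = new ArrayList<>();
        for (String line : stackTrace.split("\n")) {
            Matcher m = AT_LINE.matcher(line);
            if (m.matches()) {
                // Method name is folded into group(1); line number is unknown (-1),
                // matching the reconstructed StackTraceElement in the patch.
                frames.add(new StackTraceElement(m.group(1), "", m.group(2), -1));
            }
        }
        return frames;
    }
}
```

Because the location group is `(.*)` rather than `(.+):(\d+)`, frames like `Thread.run(Native Method)` survive the round trip instead of being mangled, which is the point of the relaxed pattern in the patch.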
This ensures that each PhysicalOp has its own distinct schema propagation logic, aligning schema handling more closely with the execution layer. To accommodate the need for schema propagation in the logical plan (primarily for testing purposes), a new method, `getExternalOutputSchemas`, has been introduced. This method facilitates the propagation of schemas across all PhysicalOps within a logical operator, ensuring compatibility with existing testing workflows. --- .../ics/amber/compiler/WorkflowCompiler.scala | 5 +- .../amber/core/workflow/PhysicalPlan.scala | 28 +++++- .../uci/ics/amber/operator/LogicalOp.scala | 59 ++++++----- .../operator/PythonOperatorDescriptor.scala | 15 +-- .../operator/aggregate/AggregateOpDesc.scala | 52 +++------- .../CartesianProductOpDesc.scala | 98 ++++++++----------- .../dictionary/DictionaryMatcherOpDesc.scala | 25 +++-- .../difference/DifferenceOpDesc.scala | 16 ++- .../operator/distinct/DistinctOpDesc.scala | 13 +-- .../amber/operator/dummy/DummyOpDesc.scala | 3 - .../amber/operator/filter/FilterOpDesc.scala | 9 +- .../operator/hashJoin/HashJoinOpDesc.scala | 60 +++++------- ...gingFaceIrisLogisticRegressionOpDesc.scala | 20 ++-- .../HuggingFaceSentimentAnalysisOpDesc.scala | 22 +++-- .../HuggingFaceSpamSMSDetectionOpDesc.scala | 20 ++-- .../HuggingFaceTextSummarizationOpDesc.scala | 18 ++-- .../operator/intersect/IntersectOpDesc.scala | 13 +-- .../intervalJoin/IntervalJoinOpDesc.scala | 42 +++----- .../amber/operator/limit/LimitOpDesc.scala | 3 - .../Scorer/MachineLearningScorerOpDesc.scala | 8 +- .../base/SklearnAdvancedBaseDesc.scala | 8 +- .../ics/amber/operator/map/MapOpDesc.scala | 20 +--- .../projection/ProjectionOpDesc.scala | 46 ++++----- .../ReservoirSamplingOpDesc.scala | 7 -- .../sentiment/SentimentAnalysisOpDesc.scala | 19 ++-- .../sklearn/SklearnClassifierOpDesc.scala | 16 +-- .../SklearnLinearRegressionOpDesc.scala | 16 +-- .../sklearn/SklearnPredictionOpDesc.scala | 19 ++-- 
.../ics/amber/operator/sort/SortOpDesc.scala | 7 +- .../sortPartitions/SortPartitionsOpDesc.scala | 7 -- .../source/SourceOperatorDescriptor.scala | 6 -- .../reddit/RedditSearchSourceOpDesc.scala | 6 +- .../amber/operator/split/SplitOpDesc.scala | 23 ++--- .../SymmetricDifferenceOpDesc.scala | 21 ++-- .../typecasting/TypeCastingOpDesc.scala | 6 -- .../operator/udf/java/JavaUDFOpDesc.scala | 20 ---- .../DualInputPortsPythonUDFOpDescV2.scala | 66 +++++-------- .../python/PythonLambdaFunctionOpDesc.scala | 13 ++- .../udf/python/PythonTableReducerOpDesc.scala | 30 +++--- .../udf/python/PythonUDFOpDescV2.scala | 55 +++-------- .../source/PythonUDFSourceOpDescV2.scala | 12 +-- .../ics/amber/operator/udf/r/RUDFOpDesc.scala | 47 ++------- .../operator/udf/r/RUDFSourceOpDesc.scala | 11 +-- .../amber/operator/union/UnionOpDesc.scala | 13 +-- .../unneststring/UnnestStringOpDesc.scala | 25 +++-- .../visualization/DotPlot/DotPlotOpDesc.scala | 12 ++- .../IcicleChart/IcicleChartOpDesc.scala | 12 ++- .../ImageViz/ImageVisualizerOpDesc.scala | 12 ++- .../ScatterMatrixChartOpDesc.scala | 12 ++- .../barChart/BarChartOpDesc.scala | 16 ++- .../visualization/boxPlot/BoxPlotOpDesc.scala | 12 ++- .../bubbleChart/BubbleChartOpDesc.scala | 12 ++- .../CandlestickChartOpDesc.scala | 12 ++- .../ContinuousErrorBandsOpDesc.scala | 12 ++- .../contourPlot/ContourPlotOpDesc.scala | 12 ++- .../dumbbellPlot/DumbbellPlotOpDesc.scala | 12 ++- .../FigureFactoryTableOpDesc.scala | 12 ++- .../filledAreaPlot/FilledAreaPlotOpDesc.scala | 12 ++- .../funnelPlot/FunnelPlotOpDesc.scala | 12 ++- .../ganttChart/GanttChartOpDesc.scala | 12 ++- .../visualization/heatMap/HeatMapOpDesc.scala | 12 ++- .../hierarchychart/HierarchyChartOpDesc.scala | 13 ++- .../histogram/HistogramChartOpDesc.scala | 12 ++- .../visualization/htmlviz/HtmlVizOpDesc.scala | 19 ++-- .../lineChart/LineChartOpDesc.scala | 12 ++- .../pieChart/PieChartOpDesc.scala | 12 ++- .../quiverPlot/QuiverPlotOpDesc.scala | 12 ++- 
.../sankeyDiagram/SankeyDiagramOpDesc.scala | 12 ++- .../scatter3DChart/Scatter3dChartOpDesc.scala | 12 ++- .../scatterplot/ScatterplotOpDesc.scala | 12 ++- .../tablesChart/TablesPlotOpDesc.scala | 12 ++- .../ternaryPlot/TernaryPlotOpDesc.scala | 13 ++- .../visualization/urlviz/UrlVizOpDesc.scala | 20 ++-- .../waterfallChart/WaterfallChartOpDesc.scala | 12 ++- .../wordCloud/WordCloudOpDesc.scala | 12 ++- .../CartesianProductOpExecSpec.scala | 5 +- .../DictionaryMatcherOpExecSpec.scala | 3 +- .../operator/hashJoin/HashJoinOpSpec.scala | 21 ++-- .../intervalJoin/IntervalOpExecSpec.scala | 7 +- .../projection/ProjectionOpDescSpec.scala | 29 ++---- .../scan/csv/CSVScanSourceOpDescSpec.scala | 5 - .../PythonLambdaFunctionOpDescSpec.scala | 9 +- .../unneststring/UnnestStringOpExecSpec.scala | 12 +-- .../htmlviz/HtmlVizOpExecSpec.scala | 4 +- 84 files changed, 754 insertions(+), 770 deletions(-) diff --git a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala index 6dd8ebbce50..c199dfaf863 100644 --- a/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala +++ b/core/workflow-compiling-service/src/main/scala/edu/uci/ics/amber/compiler/WorkflowCompiler.scala @@ -110,6 +110,8 @@ class WorkflowCompiler( logicalPlan.getTopologicalOpIds.asScala.foreach(logicalOpId => Try { val logicalOp = logicalPlan.getOperator(logicalOpId) + val allUpstreamLinks = logicalPlan + .getUpstreamLinks(logicalOp.operatorIdentifier) val subPlan = logicalOp.getPhysicalPlan(context.workflowId, context.executionId) subPlan @@ -117,8 +119,7 @@ class WorkflowCompiler( .map(subPlan.getOperator) .foreach({ physicalOp => { - val externalLinks = logicalPlan - .getUpstreamLinks(logicalOp.operatorIdentifier) + val externalLinks = allUpstreamLinks .filter(link => physicalOp.inputPorts.contains(link.toPortId)) .flatMap { link => 
physicalPlan diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalPlan.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalPlan.scala index a405ea646da..1c3d06519c0 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalPlan.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalPlan.scala @@ -2,13 +2,13 @@ package edu.uci.ics.amber.core.workflow import com.fasterxml.jackson.annotation.JsonIgnore import com.typesafe.scalalogging.LazyLogging -import edu.uci.ics.amber.util.VirtualIdentityUtils +import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.virtualidentity.{ ActorVirtualIdentity, OperatorIdentity, PhysicalOpIdentity } -import edu.uci.ics.amber.core.workflow.PhysicalLink +import edu.uci.ics.amber.util.VirtualIdentityUtils import org.jgrapht.alg.connectivity.BiconnectivityInspector import org.jgrapht.alg.shortestpath.AllDirectedPaths import org.jgrapht.graph.DirectedAcyclicGraph @@ -285,4 +285,28 @@ case class PhysicalPlan( chains.filter(s1 => chains.forall(s2 => s1 == s2 || !s1.subsetOf(s2))).toSet } + def propagateSchema(inputSchemas: Map[PortIdentity, Schema]): PhysicalPlan = { + var physicalPlan = PhysicalPlan(operators = Set.empty, links = Set.empty) + this + .topologicalIterator() + .map(this.getOperator) + .foreach({ physicalOp => + { + val propagatedPhysicalOp = physicalOp.inputPorts.keys.foldLeft(physicalOp) { + (op, inputPortId) => + op.propagateSchema(inputSchemas.get(inputPortId).map(schema => (inputPortId, schema))) + } + + // Add the operator to the physical plan + physicalPlan = physicalPlan.addOperator(propagatedPhysicalOp.propagateSchema()) + + // Add internal links to the physical plan + physicalPlan = getUpstreamPhysicalLinks(physicalOp.id).foldLeft(physicalPlan) { + (plan, link) => + plan.addLink(link) + } + } + }) + physicalPlan + } } diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala index 08e750ed8c3..56316f4e23c 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/LogicalOp.scala @@ -5,7 +5,13 @@ import com.fasterxml.jackson.annotation._ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.executor.OperatorExecutor import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.{PhysicalOp, PhysicalPlan} +import edu.uci.ics.amber.core.virtualidentity.{ + ExecutionIdentity, + OperatorIdentity, + WorkflowIdentity +} +import edu.uci.ics.amber.core.workflow.WorkflowContext.{DEFAULT_EXECUTION_ID, DEFAULT_WORKFLOW_ID} +import edu.uci.ics.amber.core.workflow.{PhysicalOp, PhysicalPlan, PortIdentity} import edu.uci.ics.amber.operator.aggregate.AggregateOpDesc import edu.uci.ics.amber.operator.cartesianProduct.CartesianProductOpDesc import edu.uci.ics.amber.operator.dictionary.DictionaryMatcherOpDesc @@ -94,16 +100,9 @@ import edu.uci.ics.amber.operator.visualization.ternaryPlot.TernaryPlotOpDesc import edu.uci.ics.amber.operator.visualization.urlviz.UrlVizOpDesc import edu.uci.ics.amber.operator.visualization.waterfallChart.WaterfallChartOpDesc import edu.uci.ics.amber.operator.visualization.wordCloud.WordCloudOpDesc -import edu.uci.ics.amber.core.virtualidentity.{ - ExecutionIdentity, - OperatorIdentity, - WorkflowIdentity -} -import edu.uci.ics.amber.core.workflow.PortIdentity import org.apache.commons.lang3.builder.{EqualsBuilder, HashCodeBuilder, ToStringBuilder} import java.util.UUID -import scala.collection.mutable import scala.util.Try trait StateTransferFunc @@ -288,19 +287,12 @@ abstract class LogicalOp extends PortDescriptor with Serializable { @JsonProperty(PropertyNameConstants.OPERATOR_VERSION) var 
operatorVersion: String = getOperatorVersion - @JsonIgnore - val inputPortToSchemaMapping: mutable.Map[PortIdentity, Schema] = mutable.HashMap() - @JsonIgnore - val outputPortToSchemaMapping: mutable.Map[PortIdentity, Schema] = mutable.HashMap() - def operatorIdentifier: OperatorIdentity = OperatorIdentity(operatorId) def getPhysicalOp( workflowId: WorkflowIdentity, executionId: ExecutionIdentity - ): PhysicalOp = { - ??? - } + ): PhysicalOp = ??? // a logical operator corresponds multiple physical operators (a small DAG) def getPhysicalPlan( @@ -315,19 +307,12 @@ abstract class LogicalOp extends PortDescriptor with Serializable { def operatorInfo: OperatorInfo - def getOutputSchema(schemas: Array[Schema]): Schema - private def getOperatorVersion: String = { val path = "core/amber/src/main/scala/" val operatorPath = path + this.getClass.getPackage.getName.replace(".", "/") OPVersion.getVersion(this.getClass.getSimpleName, operatorPath) } - // override if the operator has multiple output ports, schema must be specified for each port - def getOutputSchemas(schemas: Array[Schema]): Array[Schema] = { - Array.fill(1)(getOutputSchema(schemas)) - } - override def hashCode: Int = HashCodeBuilder.reflectionHashCode(this) override def equals(that: Any): Boolean = EqualsBuilder.reflectionEquals(this, that, "context") @@ -354,4 +339,32 @@ abstract class LogicalOp extends PortDescriptor with Serializable { @JsonPropertyDescription("Add dummy property if needed") var dummyPropertyList: List[DummyProperties] = List() + /** + * Propagates the schema from external input ports to external output ports. + * This method is primarily used to derive the output schemas for logical operators. + * + * @param inputSchemas A map containing the schemas of the external input ports. + * @return A map of external output port identities to their corresponding schemas. 
+ */ + def getExternalOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { + this + .getPhysicalPlan(DEFAULT_WORKFLOW_ID, DEFAULT_EXECUTION_ID) + .propagateSchema(inputSchemas) + .operators + .flatMap { operator => + operator.outputPorts.values + .filterNot { case (port, _, _) => port.id.internal } // Exclude internal ports + .map { + case (port, _, schemaEither) => + schemaEither match { + case Left(error) => throw error + case Right(schema) => + port.id -> schema // Map external port ID to its schema + } + } + } + .toMap + } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala index 941db76f9d5..479352daa0c 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/PythonOperatorDescriptor.scala @@ -1,8 +1,9 @@ package edu.uci.ics.amber.operator import edu.uci.ics.amber.core.executor.OpExecWithCode +import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.core.workflow.{PhysicalOp, PortIdentity, SchemaPropagationFunc} trait PythonOperatorDescriptor extends LogicalOp { override def getPhysicalOp( @@ -29,15 +30,7 @@ trait PythonOperatorDescriptor extends LogicalOp { .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) .withParallelizable(parallelizable()) - .withPropagateSchema( - SchemaPropagationFunc(inputSchemas => - Map( - operatorInfo.outputPorts.head.id -> getOutputSchema( - operatorInfo.inputPorts.map(_.id).map(inputSchemas(_)).toArray - ) - ) - ) - ) + .withPropagateSchema(SchemaPropagationFunc(inputSchemas => getOutputSchemas(inputSchemas))) } def 
parallelizable(): Boolean = false @@ -52,4 +45,6 @@ trait PythonOperatorDescriptor extends LogicalOp { */ def generatePythonCode(): String + def getOutputSchemas(inputSchemas: Map[PortIdentity, Schema]): Map[PortIdentity, Schema] + } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala index 14c138562f4..0ea2557f4ef 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala @@ -34,7 +34,7 @@ class AggregateOpDesc extends LogicalOp { workflowId: WorkflowIdentity, executionId: ExecutionIdentity ): PhysicalPlan = { - + if (groupByKeys == null) groupByKeys = List() // TODO: this is supposed to be blocking but due to limitations of materialization naming on the logical operator // we are keeping it not annotated as blocking. 
val inputPort = InputPort(PortIdentity()) @@ -53,12 +53,17 @@ class AggregateOpDesc extends LogicalOp { .withOutputPorts(List(outputPort)) .withPropagateSchema( SchemaPropagationFunc(inputSchemas => { - aggregations = localAggregations - Map( - PortIdentity(internal = true) -> getOutputSchema( - operatorInfo.inputPorts.map(port => inputSchemas(port.id)).toArray + val inputSchema = inputSchemas(operatorInfo.inputPorts.head.id) + val outputSchema = Schema + .builder() + .add(groupByKeys.map(key => inputSchema.getAttribute(key)): _*) + .add( + localAggregations.map(agg => + agg.getAggregationAttribute(inputSchema.getAttribute(agg.attribute).getType) + ) ) - ) + .build() + Map(PortIdentity(internal = true) -> outputSchema) }) ) @@ -81,9 +86,7 @@ class AggregateOpDesc extends LogicalOp { .withOutputPorts(List(finalOutputPort)) .withPropagateSchema( SchemaPropagationFunc(inputSchemas => - Map(operatorInfo.outputPorts.head.id -> { - inputSchemas(finalInputPort.id) - }) + Map(operatorInfo.outputPorts.head.id -> inputSchemas(finalInputPort.id)) ) ) .withPartitionRequirement(List(Option(HashPartition(groupByKeys)))) @@ -104,34 +107,7 @@ class AggregateOpDesc extends LogicalOp { "Aggregate", "Calculate different types of aggregation values", OperatorGroupConstants.AGGREGATE_GROUP, - inputPorts = List( - InputPort(PortIdentity()) - ), - outputPorts = List( - OutputPort(PortIdentity()) - ) + inputPorts = List(InputPort()), + outputPorts = List(OutputPort()) ) - - override def getOutputSchema(schemas: Array[Schema]): Schema = { - if ( - aggregations.exists(agg => agg.resultAttribute == null || agg.resultAttribute.trim.isEmpty) - ) { - return null - } - if (groupByKeys == null) groupByKeys = List() - Schema - .builder() - .add( - Schema - .builder() - .add(groupByKeys.map(key => schemas(0).getAttribute(key)): _*) - .build() - ) - .add( - aggregations.map(agg => - agg.getAggregationAttribute(schemas(0).getAttribute(agg.attribute).getType) - ) - ) - .build() - } } diff --git 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala index 7e71d29b42b..c17a94e3a40 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala @@ -22,16 +22,49 @@ class CartesianProductOpDesc extends LogicalOp { .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) .withPropagateSchema( - SchemaPropagationFunc(inputSchemas => - Map( - operatorInfo.outputPorts.head.id -> getOutputSchema( - Array( - inputSchemas(operatorInfo.inputPorts.head.id), - inputSchemas(operatorInfo.inputPorts.last.id) - ) - ) - ) - ) + SchemaPropagationFunc(inputSchemas => { + + // Combines the left and right input schemas into a single output schema. + // + // - The output schema includes all attributes from the left schema first, followed by + // attributes from the right schema. + // - Duplicate attribute names are resolved by appending an increasing suffix (e.g., `#@1`, `#@2`). + // - Attributes from the left schema retain their original names in the output schema. + // + // Example: + // Left schema: (dup, dup#@1, dup#@2) + // Right schema: (r1, r2, dup) + // Output schema: (dup, dup#@1, dup#@2, r1, r2, dup#@3) + // + // In this example, the last attribute from the right schema (`dup`) is renamed to `dup#@3` + // to avoid conflicts. 
+ + val builder = Schema.builder() + val leftSchema = inputSchemas(operatorInfo.inputPorts.head.id) + val rightSchema = inputSchemas(operatorInfo.inputPorts.last.id) + val leftAttributeNames = leftSchema.getAttributeNames + val rightAttributeNames = rightSchema.getAttributeNames + builder.add(leftSchema) + rightSchema.getAttributes.foreach(attr => { + var newName = attr.getName + while ( + leftAttributeNames.contains(newName) || rightAttributeNames + .filterNot(attrName => attrName == attr.getName) + .contains(newName) + ) { + newName = s"$newName#@1" + } + if (newName == attr.getName) { + // non-duplicate attribute, add to builder as is + builder.add(attr) + } else { + // renamed the duplicate attribute, construct new Attribute + builder.add(new Attribute(newName, attr.getType)) + } + }) + val outputSchema = builder.build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) + }) ) // TODO : refactor to parallelize this operator for better performance and scalability: // can consider hash partition on larger input, broadcast smaller table to each partition @@ -39,46 +72,6 @@ class CartesianProductOpDesc extends LogicalOp { } - /** - * returns a Schema in order of the left input attributes followed by the right attributes - * duplicate attribute names are handled with an increasing suffix count - * - * Left schema attributes should always retain the same name in output schema - * - * For example, Left(dup, dup#@1, dup#@2) cartesian product with Right(r1, r2, dup) - * has output schema: (dup, dup#@1, dup#@2, r1, r2, dup#@3) - * - * Since the last attribute of Right is a duplicate, it increases suffix until it is - * no longer a duplicate, resulting in dup#@3 - */ - def getOutputSchemaInternal(schemas: Array[Schema]): Schema = { - // merge left / right schemas together, sequentially with left schema first - val builder = Schema.builder() - val leftSchema = schemas(0) - val leftAttributeNames = leftSchema.getAttributeNames - val rightSchema = schemas(1) - val 
rightAttributeNames = rightSchema.getAttributeNames - builder.add(leftSchema) - rightSchema.getAttributes.foreach(attr => { - var newName = attr.getName - while ( - leftAttributeNames.contains(newName) || rightAttributeNames - .filterNot(attrName => attrName == attr.getName) - .contains(newName) - ) { - newName = s"$newName#@1" - } - if (newName == attr.getName) { - // non-duplicate attribute, add to builder as is - builder.add(attr) - } else { - // renamed the duplicate attribute, construct new Attribute - builder.add(new Attribute(newName, attr.getType)) - } - }) - builder.build() - } - override def operatorInfo: OperatorInfo = OperatorInfo( "Cartesian Product", @@ -90,9 +83,4 @@ class CartesianProductOpDesc extends LogicalOp { ), outputPorts = List(OutputPort()) ) - - // remove duplicates in attribute names - override def getOutputSchema(schemas: Array[Schema]): Schema = { - getOutputSchemaInternal(schemas) - } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala index 4a2cb463355..2a82b03d10b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala @@ -1,16 +1,14 @@ package edu.uci.ics.amber.operator.dictionary import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} -import com.google.common.base.Preconditions import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} -import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalOp, SchemaPropagationFunc} import 
edu.uci.ics.amber.operator.map.MapOpDesc import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.util.JSONUtils.objectMapper -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} /** * Dictionary matcher operator matches a tuple if the specified column is in the given dictionary. @@ -48,9 +46,16 @@ class DictionaryMatcherOpDesc extends MapOpDesc { .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) .withPropagateSchema( - SchemaPropagationFunc(inputSchemas => - Map(operatorInfo.outputPorts.head.id -> getOutputSchema(inputSchemas.values.toArray)) - ) + SchemaPropagationFunc(inputSchemas => { + if (resultAttribute == null || resultAttribute.trim.isEmpty) return null + Map( + operatorInfo.outputPorts.head.id -> Schema + .builder() + .add(inputSchemas.values.head) + .add(resultAttribute, AttributeType.BOOLEAN) + .build() + ) + }) ) } @@ -63,10 +68,4 @@ class DictionaryMatcherOpDesc extends MapOpDesc { outputPorts = List(OutputPort()), supportReconfiguration = true ) - - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Preconditions.checkArgument(schemas.length == 1) - if (resultAttribute == null || resultAttribute.trim.isEmpty) return null - Schema.builder().add(schemas(0)).add(resultAttribute, AttributeType.BOOLEAN).build() - } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala index 8c144b3756a..8cb81f186c3 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/difference/DifferenceOpDesc.scala @@ -2,12 +2,10 @@ 
package edu.uci.ics.amber.operator.difference import com.google.common.base.Preconditions import edu.uci.ics.amber.core.executor.OpExecWithClassName -import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow._ import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} class DifferenceOpDesc extends LogicalOp { @@ -26,6 +24,11 @@ class DifferenceOpDesc extends LogicalOp { .withOutputPorts(operatorInfo.outputPorts) .withPartitionRequirement(List(Option(HashPartition()), Option(HashPartition()))) .withDerivePartition(_ => HashPartition()) + .withPropagateSchema(SchemaPropagationFunc(inputSchemas => { + Preconditions.checkArgument(inputSchemas.values.toSet.size == 1) + val outputSchema = inputSchemas.values.head + operatorInfo.outputPorts.map(port => port.id -> outputSchema).toMap + })) } override def operatorInfo: OperatorInfo = @@ -39,9 +42,4 @@ class DifferenceOpDesc extends LogicalOp { ), outputPorts = List(OutputPort(blocking = true)) ) - - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Preconditions.checkArgument(schemas.forall(_ == schemas(0))) - schemas(0) - } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala index 30c2f9f4b27..7f851743b18 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/distinct/DistinctOpDesc.scala @@ -1,13 +1,10 @@ package 
edu.uci.ics.amber.operator.distinct -import com.google.common.base.Preconditions import edu.uci.ics.amber.core.executor.OpExecWithClassName -import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{HashPartition, InputPort, OutputPort, PhysicalOp} import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class DistinctOpDesc extends LogicalOp { @@ -26,6 +23,7 @@ class DistinctOpDesc extends LogicalOp { .withOutputPorts(operatorInfo.outputPorts) .withPartitionRequirement(List(Option(HashPartition()))) .withDerivePartition(_ => HashPartition()) + } override def operatorInfo: OperatorInfo = @@ -37,9 +35,4 @@ class DistinctOpDesc extends LogicalOp { outputPorts = List(OutputPort(blocking = true)) ) - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Preconditions.checkArgument(schemas.forall(_ == schemas(0))) - schemas(0) - } - } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dummy/DummyOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dummy/DummyOpDesc.scala index 75ce5a933cd..8cdb0d5a5b5 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dummy/DummyOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dummy/DummyOpDesc.scala @@ -2,7 +2,6 @@ package edu.uci.ics.amber.operator.dummy import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.Schema import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, 
OperatorInfo} import edu.uci.ics.amber.operator.{LogicalOp, PortDescription, PortDescriptor} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} @@ -48,6 +47,4 @@ class DummyOpDesc extends LogicalOp with PortDescriptor { allowPortCustomization = true ) } - - override def getOutputSchema(schemas: Array[Schema]): Schema = schemas(0) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpDesc.scala index 28c5e44a981..52f66143137 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/filter/FilterOpDesc.scala @@ -1,20 +1,13 @@ package edu.uci.ics.amber.operator.filter -import com.google.common.base.Preconditions -import edu.uci.ics.amber.core.tuple.Schema +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.PhysicalOp import edu.uci.ics.amber.operator.{LogicalOp, StateTransferFunc} -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import scala.util.{Success, Try} abstract class FilterOpDesc extends LogicalOp { - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Preconditions.checkArgument(schemas.length == 1) - schemas(0) - } - override def runtimeReconfiguration( workflowId: WorkflowIdentity, executionId: ExecutionIdentity, diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala index 3777b2b6216..756f468f46d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala @@ -118,13 
+118,32 @@ class HashJoinOpDesc[K] extends LogicalOp { .withDerivePartition(_ => HashPartition(List(probeAttributeName))) .withParallelizable(true) .withPropagateSchema( - SchemaPropagationFunc(inputSchemas => - Map( - PortIdentity() -> getOutputSchema( - Array(inputSchemas(PortIdentity(internal = true)), inputSchemas(PortIdentity(1))) - ) - ) - ) + SchemaPropagationFunc(inputSchemas => { + val buildSchema = inputSchemas(PortIdentity(internal = true)) + val probeSchema = inputSchemas(PortIdentity(1)) + val builder = Schema.builder() + builder.add(buildSchema) + builder.removeIfExists(HASH_JOIN_INTERNAL_KEY_NAME) + val leftAttributeNames = buildSchema.getAttributeNames + val rightAttributeNames = + probeSchema.getAttributeNames.filterNot(name => name == probeAttributeName) + + // Create a Map from rightTuple's fields, renaming conflicts + rightAttributeNames + .foreach { name => + var newName = name + while ( + leftAttributeNames.contains(newName) || rightAttributeNames + .filter(attrName => name != attrName) + .contains(newName) + ) { + newName = s"$newName#@1" + } + builder.add(new Attribute(newName, probeSchema.getAttribute(name).getType)) + } + val outputSchema = builder.build() + Map(PortIdentity() -> outputSchema) + }) ) PhysicalPlan( @@ -151,31 +170,4 @@ class HashJoinOpDesc[K] extends LogicalOp { ), outputPorts = List(OutputPort()) ) - - // remove the probe attribute in the output - override def getOutputSchema(schemas: Array[Schema]): Schema = { - val buildSchema = schemas(0) - val probeSchema = schemas(1) - val builder = Schema.builder() - builder.add(buildSchema) - builder.removeIfExists(HASH_JOIN_INTERNAL_KEY_NAME) - val leftAttributeNames = buildSchema.getAttributeNames - val rightAttributeNames = - probeSchema.getAttributeNames.filterNot(name => name == probeAttributeName) - - // Create a Map from rightTuple's fields, renaming conflicts - rightAttributeNames - .foreach { name => - var newName = name - while ( - leftAttributeNames.contains(newName) || 
rightAttributeNames - .filter(attrName => name != attrName) - .contains(newName) - ) { - newName = s"$newName#@1" - } - builder.add(new Attribute(newName, probeSchema.getAttribute(name).getType)) - } - builder.build() - } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala index a4efb8226cb..dcef5abf438 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala @@ -5,7 +5,7 @@ import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} class HuggingFaceIrisLogisticRegressionOpDesc extends PythonOperatorDescriptor { @JsonProperty(value = "petalLengthCmAttribute", required = true) @@ -90,17 +90,21 @@ class HuggingFaceIrisLogisticRegressionOpDesc extends PythonOperatorDescriptor { outputPorts = List(OutputPort()) ) - override def getOutputSchema(schemas: Array[Schema]): Schema = { + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { if ( predictionClassName == null || predictionClassName.trim.isEmpty || predictionProbabilityName == null || predictionProbabilityName.trim.isEmpty ) throw new RuntimeException("Result attribute name should not be empty") - Schema - .builder() - .add(schemas(0)) - .add(predictionClassName, AttributeType.STRING) - 
.add(predictionProbabilityName, AttributeType.DOUBLE) - .build() + Map( + operatorInfo.outputPorts.head.id -> Schema + .builder() + .add(inputSchemas(operatorInfo.inputPorts.head.id)) + .add(predictionClassName, AttributeType.STRING) + .add(predictionProbabilityName, AttributeType.DOUBLE) + .build() + ) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala index 04a603ed85c..5e9027951a9 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.huggingFace import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName @@ -77,19 +77,23 @@ class HuggingFaceSentimentAnalysisOpDesc extends PythonOperatorDescriptor { supportReconfiguration = true ) - override def getOutputSchema(schemas: Array[Schema]): Schema = { + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { if ( resultAttributePositive == null || resultAttributePositive.trim.isEmpty || resultAttributeNeutral == null || resultAttributeNeutral.trim.isEmpty || resultAttributeNegative == null || resultAttributeNegative.trim.isEmpty ) return null - Schema - .builder() - .add(schemas(0)) - 
.add(resultAttributePositive, AttributeType.DOUBLE)
-      .add(resultAttributeNeutral, AttributeType.DOUBLE)
-      .add(resultAttributeNegative, AttributeType.DOUBLE)
-      .build()
+    Map(
+      operatorInfo.outputPorts.head.id -> Schema
+        .builder()
+        .add(inputSchemas(operatorInfo.inputPorts.head.id))
+        .add(resultAttributePositive, AttributeType.DOUBLE)
+        .add(resultAttributeNeutral, AttributeType.DOUBLE)
+        .add(resultAttributeNegative, AttributeType.DOUBLE)
+        .build()
+    )
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala
index cf1c43dd701..4257c17a6d5 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala
@@ -5,7 +5,7 @@
 import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class HuggingFaceSpamSMSDetectionOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(value = "attribute", required = true)
   @JsonPropertyDescription("column to perform spam detection on")
@@ -54,12 +54,16 @@
       outputPorts = List(OutputPort())
     )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema
-      .builder()
-      .add(schemas(0))
-      .add(resultAttributeSpam, AttributeType.BOOLEAN)
-      .add(resultAttributeProbability, AttributeType.DOUBLE)
-      .build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    Map(
+      operatorInfo.outputPorts.head.id -> Schema
+        .builder()
+        .add(inputSchemas.values.head)
+        .add(resultAttributeSpam, AttributeType.BOOLEAN)
+        .add(resultAttributeProbability, AttributeType.DOUBLE)
+        .build()
+    )
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala
index 349842369fb..e79369fb959 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.huggingFace
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
@@ -57,13 +57,17 @@
       outputPorts = List(OutputPort())
     )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
     if (resultAttribute == null || resultAttribute.trim.isEmpty)
       throw new RuntimeException("Result attribute name should be given")
-    Schema
-      .builder()
-      .add(schemas(0))
-      .add(resultAttribute, AttributeType.STRING)
-      .build()
+    Map(
+      operatorInfo.outputPorts.head.id -> Schema
+        .builder()
+        .add(inputSchemas.values.head)
+        .add(resultAttribute, AttributeType.STRING)
+        .build()
+    )
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala
index 1de8534ac11..48cc74ea8a7 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intersect/IntersectOpDesc.scala
@@ -1,13 +1,10 @@
 package edu.uci.ics.amber.operator.intersect
 
-import com.google.common.base.Preconditions
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.Schema
-import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow._
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class IntersectOpDesc extends LogicalOp {
 
@@ -36,10 +33,4 @@
     inputPorts = List(InputPort(PortIdentity()), InputPort(PortIdentity(1))),
     outputPorts = List(OutputPort(blocking = true))
   )
-
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Preconditions.checkArgument(schemas.forall(_ == schemas(0)))
-    schemas(0)
-  }
-
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala
index 985b8f9c4d6..764a42b2708 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala
@@ -91,16 +91,22 @@
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
       .withPropagateSchema(
-        SchemaPropagationFunc(inputSchemas =>
-          Map(
-            operatorInfo.outputPorts.head.id -> getOutputSchema(
-              Array(
-                inputSchemas(operatorInfo.inputPorts.head.id),
-                inputSchemas(operatorInfo.inputPorts.last.id)
-              )
-            )
-          )
-        )
+        SchemaPropagationFunc(inputSchemas => {
+          val builder: Schema.Builder = Schema.builder()
+          val leftTableSchema: Schema = inputSchemas(operatorInfo.inputPorts.head.id)
+          val rightTableSchema: Schema = inputSchemas(operatorInfo.inputPorts.last.id)
+          builder.add(leftTableSchema)
+          rightTableSchema.getAttributes
+            .map(attr => {
+              if (leftTableSchema.containsAttribute(attr.getName)) {
+                builder.add(new Attribute(s"${attr.getName}#@1", attr.getType))
+              } else {
+                builder.add(attr.getName, attr.getType)
+              }
+            })
+          val outputSchema = builder.build()
+          Map(operatorInfo.outputPorts.head.id -> outputSchema)
+        })
       )
       .withPartitionRequirement(partitionRequirement)
   }
@@ -138,20 +144,4 @@
     this.timeIntervalType = Some(timeIntervalType)
   }
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    val builder: Schema.Builder = Schema.builder()
-    val leftTableSchema: Schema = schemas(0)
-    val rightTableSchema: Schema = schemas(1)
-    builder.add(leftTableSchema)
-    rightTableSchema.getAttributes
-      .map(attr => {
-        if (leftTableSchema.containsAttribute(attr.getName)) {
-          builder.add(new Attribute(s"${attr.getName}#@1", attr.getType))
-        } else {
-          builder.add(attr.getName, attr.getType)
-        }
-      })
-    builder.build()
-  }
-
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala
index 70ebe4725f4..ae0d7768bb1 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/limit/LimitOpDesc.scala
@@ -3,7 +3,6 @@ package edu.uci.ics.amber.operator.limit
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.{LogicalOp, StateTransferFunc}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
@@ -49,8 +48,6 @@
       supportReconfiguration = true
     )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = schemas(0)
-
   override def runtimeReconfiguration(
       workflowId: WorkflowIdentity,
       executionId: ExecutionIdentity,
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala
index 8183cf14e4c..62ca41b34eb 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala
@@ -10,7 +10,7 @@
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.{AutofillAttributeName, HideAnnotation}
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class MachineLearningScorerOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(required = true, defaultValue = "false")
@@ -64,7 +64,9 @@
     inputPorts = List(InputPort()),
     outputPorts = List(OutputPort())
   )
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
     val outputSchemaBuilder = Schema.builder()
     if (!isRegression) {
       outputSchemaBuilder.add(new Attribute("Class", AttributeType.STRING))
@@ -79,7 +81,7 @@
       outputSchemaBuilder.add(new Attribute(metricName, AttributeType.DOUBLE))
     })
 
-    outputSchemaBuilder.build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build())
   }
 
   //  private def getClassificationScorerName(scorer: classificationMetricsFnc): String = {
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala
index 66467291eb0..0d35b6cbc85 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala
@@ -149,9 +149,13 @@ abstract class SklearnMLOperatorDescriptor[T <: ParamClass] extends PythonOperat
     )
   }
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
     val outputSchemaBuilder = Schema.builder()
     outputSchemaBuilder.add(new Attribute("Model", AttributeType.BINARY))
-    outputSchemaBuilder.add(new Attribute("Parameters", AttributeType.STRING)).build()
+    outputSchemaBuilder.add(new Attribute("Parameters", AttributeType.STRING))
+
+    Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build())
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/map/MapOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/map/MapOpDesc.scala
index 5cce0ad9fb3..f47aca589be 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/map/MapOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/map/MapOpDesc.scala
@@ -4,7 +4,7 @@
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.{LogicalOp, StateTransferFunc}
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
 
-import scala.util.{Failure, Success, Try}
+import scala.util.{Success, Try}
 
 abstract class MapOpDesc extends LogicalOp {
 
@@ -14,22 +14,6 @@
       oldOpDesc: LogicalOp,
       newOpDesc: LogicalOp
   ): Try[(PhysicalOp, Option[StateTransferFunc])] = {
-    val inputSchemas = oldOpDesc.operatorInfo.inputPorts
-      .map(inputPort => oldOpDesc.inputPortToSchemaMapping(inputPort.id))
-      .toArray
-    val outputSchemas = oldOpDesc.operatorInfo.outputPorts
-      .map(outputPort => oldOpDesc.outputPortToSchemaMapping(outputPort.id))
-      .toArray
-    val newOutputSchema = newOpDesc.getOutputSchema(inputSchemas)
-    if (!newOutputSchema.equals(outputSchemas.head)) {
-      Failure(
-        new UnsupportedOperationException(
-          "reconfigurations that change output schema are not supported"
-        )
-      )
-    } else {
-      Success(newOpDesc.getPhysicalOp(workflowId, executionId), None)
-    }
+    Success(newOpDesc.getPhysicalOp(workflowId, executionId), None)
   }
-
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala
index 47b80cfaef0..39183a07ea5 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala
@@ -39,10 +39,26 @@
       .withOutputPorts(operatorInfo.outputPorts)
       .withDerivePartition(derivePartition())
       .withPropagateSchema(SchemaPropagationFunc(inputSchemas => {
+        Preconditions.checkArgument(attributes.nonEmpty)
+        val inputSchema = inputSchemas.values.head
+        val outputSchema = if (!isDrop) {
+          Schema
+            .builder()
+            .add(attributes.map { attribute =>
+              val originalType = inputSchema.getAttribute(attribute.getOriginalAttribute).getType
+              new Attribute(attribute.getAlias, originalType)
+            })
+            .build()
+        } else {
+          val outputSchemaBuilder = Schema.builder()
+          outputSchemaBuilder.add(inputSchema)
+          for (attribute <- attributes) {
+            outputSchemaBuilder.removeIfExists(attribute.getOriginalAttribute)
+          }
+          outputSchemaBuilder.build()
+        }
         Map(
-          operatorInfo.outputPorts.head.id -> getOutputSchema(
-            Array(inputSchemas(operatorInfo.inputPorts.head.id))
-          )
+          operatorInfo.outputPorts.head.id -> outputSchema
         )
       }))
   }
@@ -71,28 +87,4 @@
       outputPorts = List(OutputPort())
     )
   }
-
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Preconditions.checkArgument(schemas.length == 1)
-    Preconditions.checkArgument(attributes.nonEmpty)
-    if (!isDrop) {
-      Schema
-        .builder()
-        .add(attributes.map { attribute =>
-          val originalType = schemas.head.getAttribute(attribute.getOriginalAttribute).getType
-          new Attribute(attribute.getAlias, originalType)
-        })
-        .build()
-    } else {
-      val outputSchemaBuilder = Schema.builder()
-      val inputSchema = schemas(0)
-      outputSchemaBuilder.add(inputSchema)
-      for (attribute <- attributes) {
-        outputSchemaBuilder.removeIfExists(attribute.getOriginalAttribute)
-      }
-      outputSchemaBuilder.build()
-
-    }
-
-  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala
index cc1840609bf..cc65876a2f5 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/reservoirsampling/ReservoirSamplingOpDesc.scala
@@ -1,9 +1,7 @@
 package edu.uci.ics.amber.operator.reservoirsampling
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
-import com.google.common.base.Preconditions
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.PhysicalOp
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
@@ -44,9 +42,4 @@
     outputPorts = List(OutputPort())
   )
 }
-
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Preconditions.checkArgument(schemas.length == 1)
-    schemas(0)
-  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala
index 815b08bdb2a..155380851ba 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala
@@ -55,9 +55,17 @@
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
       .withPropagateSchema(
-        SchemaPropagationFunc(inputSchemas =>
-          Map(operatorInfo.outputPorts.head.id -> getOutputSchema(inputSchemas.values.toArray))
-        )
+        SchemaPropagationFunc(inputSchemas => {
+          if (resultAttribute == null || resultAttribute.trim.isEmpty)
+            throw new RuntimeException("Result attribute name should be given")
+          Map(
+            operatorInfo.outputPorts.head.id -> Schema
+              .builder()
+              .add(inputSchemas.values.head)
+              .add(resultAttribute, AttributeType.INTEGER)
+              .build()
+          )
+        })
       )
   }
@@ -71,9 +79,4 @@
     supportReconfiguration = true
   )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    if (resultAttribute == null || resultAttribute.trim.isEmpty)
-      return null
-    Schema.builder().add(schemas(0)).add(resultAttribute, AttributeType.INTEGER).build()
-  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala
index 09881901612..2279f4126dd 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala
@@ -106,11 +106,15 @@
     outputPorts = List(OutputPort(blocking = true))
   )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema
-      .builder()
-      .add("model_name", AttributeType.STRING)
-      .add("model", AttributeType.BINARY)
-      .build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    Map(
+      operatorInfo.outputPorts.head.id -> Schema
+        .builder()
+        .add("model_name", AttributeType.STRING)
+        .add("model", AttributeType.BINARY)
+        .build()
+    )
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala
index a55cb953395..35e0e7d4d9d 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala
@@ -59,12 +59,16 @@
     outputPorts = List(OutputPort(blocking = true))
   )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema
-      .builder()
-      .add("model_name", AttributeType.STRING)
-      .add("model", AttributeType.BINARY)
-      .build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    Map(
+      operatorInfo.outputPorts.head.id -> Schema
+        .builder()
+        .add("model_name", AttributeType.STRING)
+        .add("model", AttributeType.BINARY)
+        .build()
+    )
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala
index a1d4c86eb7e..6e3c8ae5cd7 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala
@@ -58,16 +58,21 @@
     outputPorts = List(OutputPort())
   )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
     var resultType = AttributeType.STRING
+    val inputSchema = inputSchemas(operatorInfo.inputPorts(1).id)
     if (groundTruthAttribute != "") {
       resultType =
-        schemas(1).attributes.find(attr => attr.getName == groundTruthAttribute).get.getType
+        inputSchema.attributes.find(attr => attr.getName == groundTruthAttribute).get.getType
     }
-    Schema
-      .builder()
-      .add(schemas(1))
-      .add(resultAttribute, resultType)
-      .build()
+    Map(
+      operatorInfo.outputPorts.head.id -> Schema
+        .builder()
+        .add(inputSchema)
+        .add(resultAttribute, resultType)
+        .build()
+    )
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sort/SortOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sort/SortOpDesc.scala
index 39af6cd63a9..644ff0ff6cd 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sort/SortOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sort/SortOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.sort
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import edu.uci.ics.amber.core.tuple.Schema
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 
 class SortOpDesc extends PythonOperatorDescriptor {
@@ -40,6 +40,10 @@
         |        yield sorted_df""".stripMargin
   }
 
+  override def getOutputSchemas(inputSchemas: Map[PortIdentity, Schema]): Map[PortIdentity, Schema] = {
+    Map(operatorInfo.outputPorts.head.id -> inputSchemas.values.head)
+  }
+
   override def operatorInfo: OperatorInfo =
     OperatorInfo(
       "Sort",
@@ -49,5 +53,4 @@
       outputPorts = List(OutputPort(blocking = true))
     )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = schemas(0)
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala
index 6c06d95dadc..3d4809e34f5 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpDesc.scala
@@ -1,10 +1,8 @@
 package edu.uci.ics.amber.operator.sortPartitions
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
-import com.google.common.base.Preconditions
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, RangePartition}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
@@ -68,9 +66,4 @@
     inputPorts = List(InputPort()),
     outputPorts = List(OutputPort(blocking = true))
   )
-
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Preconditions.checkArgument(schemas.length == 1)
-    schemas(0)
-  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/SourceOperatorDescriptor.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/SourceOperatorDescriptor.scala
index ad36af84dbb..de87829cd38 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/SourceOperatorDescriptor.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/SourceOperatorDescriptor.scala
@@ -1,15 +1,9 @@
 package edu.uci.ics.amber.operator.source
 
-import com.google.common.base.Preconditions
 import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.operator.LogicalOp
 
 abstract class SourceOperatorDescriptor extends LogicalOp {
 
   def sourceSchema(): Schema
-
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Preconditions.checkArgument(schemas.isEmpty)
-    sourceSchema()
-  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala
index bf2c9336d83..6213cc26ded 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala
@@ -5,7 +5,7 @@
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.source.PythonSourceOperatorDescriptor
-import edu.uci.ics.amber.core.workflow.OutputPort
+import edu.uci.ics.amber.core.workflow.{OutputPort, PortIdentity}
 
 class RedditSearchSourceOpDesc extends PythonSourceOperatorDescriptor {
   @JsonProperty(required = true)
@@ -134,4 +134,8 @@
       new Attribute("subreddit", AttributeType.STRING)
     )
     .build()
+
+  override def getOutputSchemas(inputSchemas: Map[PortIdentity, Schema]): Map[PortIdentity, Schema] = {
+    Map(operatorInfo.outputPorts.head.id -> sourceSchema())
+  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala
index 134af1029cd..217bb161b2f 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/split/SplitOpDesc.scala
@@ -3,13 +3,11 @@
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.google.common.base.Preconditions
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.Schema
-import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
+import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
+import edu.uci.ics.amber.core.workflow._
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.util.JSONUtils.objectMapper
-import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 import scala.util.Random
 
 class SplitOpDesc extends LogicalOp {
@@ -40,12 +38,11 @@
       .withOutputPorts(operatorInfo.outputPorts)
       .withParallelizable(false)
       .withPropagateSchema(
-        SchemaPropagationFunc(inputSchemas =>
-          operatorInfo.outputPorts
-            .map(_.id)
-            .map(id => id -> inputSchemas(operatorInfo.inputPorts.head.id))
-            .toMap
-        )
+        SchemaPropagationFunc(inputSchemas => {
+          Preconditions.checkArgument(inputSchemas.size == 1)
+          val outputSchema = inputSchemas.values.head
+          operatorInfo.outputPorts.map(port => port.id -> outputSchema).toMap
+        })
       )
   }
@@ -64,10 +61,4 @@
     )
   }
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = throw new NotImplementedError()
-
-  override def getOutputSchemas(schemas: Array[Schema]): Array[Schema] = {
-    Preconditions.checkArgument(schemas.length == 1)
-    Array(schemas(0), schemas(0))
-  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala
index e77663fdf0b..3dc311f77d5 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpDesc.scala
@@ -2,12 +2,17 @@ package edu.uci.ics.amber.operator.symmetricDifference
 
 import com.google.common.base.Preconditions
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.Schema
-import edu.uci.ics.amber.core.workflow.{HashPartition, PhysicalOp}
+import edu.uci.ics.amber.core.workflow.{
+  HashPartition,
+  InputPort,
+  OutputPort,
+  PhysicalOp,
+  PortIdentity,
+  SchemaPropagationFunc
+}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class SymmetricDifferenceOpDesc extends LogicalOp {
 
@@ -29,6 +34,11 @@
       .withOutputPorts(operatorInfo.outputPorts)
       .withPartitionRequirement(List(Option(HashPartition()), Option(HashPartition())))
       .withDerivePartition(_ => HashPartition(List()))
+      .withPropagateSchema(SchemaPropagationFunc(inputSchemas => {
+        Preconditions.checkArgument(inputSchemas.values.toSet.size == 1)
+        val outputSchema = inputSchemas.values.head
+        operatorInfo.outputPorts.map(port => port.id -> outputSchema).toMap
+      }))
   }
 
   override def operatorInfo: OperatorInfo =
@@ -40,9 +50,4 @@
     outputPorts = List(OutputPort(blocking = true))
   )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Preconditions.checkArgument(schemas.forall(_ == schemas(0)))
-    schemas(0)
-  }
-
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala
index b52f299c0ff..f5d502437b4 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpDesc.scala
@@ -54,10 +54,4 @@
       List(OutputPort())
     )
   }
-
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    typeCastingUnits.foldLeft(schemas.head) { (schema, unit) =>
-      AttributeTypeUtils.SchemaCasting(schema, unit.attribute, unit.resultType)
-    }
-  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala
index 9fe0089c4ba..fd38d176ae1 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala
@@ -157,26 +157,6 @@
     )
   }
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    // Preconditions.checkArgument(schemas.length == 1)
-    val inputSchema = schemas(0)
-    val outputSchemaBuilder = Schema.Builder()
-    // keep the same schema from input
-    if (retainInputColumns) outputSchemaBuilder.add(inputSchema)
-    // for any javaUDFType, it can add custom output columns (attributes).
-    if (outputColumns != null) {
-      if (retainInputColumns) { // check if columns are duplicated
-
-        for (column <- outputColumns) {
-          if (inputSchema.containsAttribute(column.getName))
-            throw new RuntimeException("Column name " + column.getName + " already exists!")
-        }
-      }
-      outputSchemaBuilder.add(outputColumns)
-    }
-    outputSchemaBuilder.build()
-  }
-
   override def runtimeReconfiguration(
       workflowId: WorkflowIdentity,
       executionId: ExecutionIdentity,
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala
index 985fa54fede..a4af16c5415 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala
@@ -64,7 +64,7 @@
       executionId: ExecutionIdentity
   ): PhysicalOp = {
     Preconditions.checkArgument(workers >= 1, "Need at least 1 worker.", Array())
-    if (workers > 1) {
+    val physicalOp = if (workers > 1) {
       PhysicalOp
         .oneToOnePhysicalOp(
           workflowId,
@@ -72,15 +72,7 @@
           operatorIdentifier,
           OpExecWithCode(code, "python")
         )
-        .withDerivePartition(_ => UnknownPartition())
        .withParallelizable(true)
-        .withInputPorts(operatorInfo.inputPorts)
-        .withOutputPorts(operatorInfo.outputPorts)
-        .withPropagateSchema(
-          SchemaPropagationFunc(inputSchemas =>
-            Map(operatorInfo.outputPorts.head.id -> getOutputSchema(inputSchemas.values.toArray))
-          )
-        )
         .withSuggestedWorkerNum(workers)
     } else {
       PhysicalOp
@@ -90,20 +82,33 @@
           operatorIdentifier,
           OpExecWithCode(code, "python")
         )
-        .withDerivePartition(_ => UnknownPartition())
         .withParallelizable(false)
-        .withInputPorts(operatorInfo.inputPorts)
-        .withOutputPorts(operatorInfo.outputPorts)
-        .withPropagateSchema(
-          SchemaPropagationFunc(inputSchemas =>
-            Map(
-              operatorInfo.outputPorts.head.id -> getOutputSchema(
-                operatorInfo.inputPorts.map(_.id).map(inputSchemas(_)).toArray
-              )
-            )
-          )
-        )
     }
+    physicalOp
+      .withDerivePartition(_ => UnknownPartition())
+      .withInputPorts(operatorInfo.inputPorts)
+      .withOutputPorts(operatorInfo.outputPorts)
+      .withPropagateSchema(
+        SchemaPropagationFunc(inputSchemas => {
+          Preconditions.checkArgument(inputSchemas.size == 2)
+          val inputSchema = inputSchemas(operatorInfo.inputPorts(1).id)
+          val outputSchemaBuilder = Schema.builder()
+          // keep the same schema from input
+          if (retainInputColumns) outputSchemaBuilder.add(inputSchema)
+          // for any pythonUDFType, it can add custom output columns (attributes).
+          if (outputColumns != null) {
+            if (retainInputColumns) { // check if columns are duplicated
+
+              for (column <- outputColumns) {
+                if (inputSchema.containsAttribute(column.getName))
+                  throw new RuntimeException("Column name " + column.getName + " already exists!")
+              }
+            }
+            outputSchemaBuilder.add(outputColumns)
+          }
+          Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build())
+        })
+      )
   }
 
   override def operatorInfo: OperatorInfo =
@@ -123,23 +128,4 @@
       outputPorts = List(OutputPort())
     )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Preconditions.checkArgument(schemas.length == 2)
-    val inputSchema = schemas(1)
-    val outputSchemaBuilder = Schema.builder()
-    // keep the same schema from input
-    if (retainInputColumns) outputSchemaBuilder.add(inputSchema)
-    // for any pythonUDFType, it can add custom output columns (attributes).
-    if (outputColumns != null) {
-      if (retainInputColumns) { // check if columns are duplicated
-
-        for (column <- outputColumns) {
-          if (inputSchema.containsAttribute(column.getName))
-            throw new RuntimeException("Column name " + column.getName + " already exists!")
-        }
-      }
-      outputSchemaBuilder.add(outputColumns).build()
-    }
-    outputSchemaBuilder.build()
-  }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala
index aa016c2740e..056326c8093 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala
@@ -5,16 +5,18 @@
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{AttributeTypeUtils, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class PythonLambdaFunctionOpDesc extends PythonOperatorDescriptor {
   @JsonSchemaTitle("Add/Modify column(s)")
   var lambdaAttributeUnits: List[LambdaAttributeUnit] = List()
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Preconditions.checkArgument(schemas.length == 1)
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    Preconditions.checkArgument(inputSchemas.size == 1)
     Preconditions.checkArgument(lambdaAttributeUnits.nonEmpty)
-    val inputSchema = schemas(0)
+    val inputSchema = inputSchemas.values.head
    val outputSchemaBuilder = Schema.builder()
    // keep the same schema from input
    outputSchemaBuilder.add(inputSchema)
@@ -37,7 +39,8 @@ class PythonLambdaFunctionOpDesc extends PythonOperatorDescriptor { outputSchema = AttributeTypeUtils.SchemaCasting(outputSchema, unit.attributeName, unit.attributeType) } - outputSchema + Map(operatorInfo.outputPorts.head.id -> outputSchema) + } override def operatorInfo: OperatorInfo = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala index aa36eaced06..0f6d1988de5 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala @@ -3,21 +3,23 @@ package edu.uci.ics.amber.operator.udf.python import com.google.common.base.Preconditions import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.Schema +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class PythonTableReducerOpDesc extends PythonOperatorDescriptor { @JsonSchemaTitle("Output columns") var lambdaAttributeUnits: List[LambdaAttributeUnit] = List() - override def getOutputSchema(schemas: Array[Schema]): Schema = { + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { Preconditions.checkArgument(lambdaAttributeUnits.nonEmpty) val outputSchemaBuilder = Schema.builder() for (unit <- lambdaAttributeUnits) { outputSchemaBuilder.add(unit.attributeName, unit.attributeType) } - outputSchemaBuilder.build() + Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build()) } override def operatorInfo: OperatorInfo = @@ -30,17 
+32,17 @@ class PythonTableReducerOpDesc extends PythonOperatorDescriptor { ) override def generatePythonCode(): String = { - var outputTable = "{" - for (unit <- lambdaAttributeUnits) { - outputTable += s"""\"${unit.attributeName}\":${unit.expression},""" - } - outputTable += "}" + val outputTable = lambdaAttributeUnits + .map(unit => s"""\"${unit.attributeName}\": ${unit.expression}""") + .mkString("{", ", ", "}") + s""" -from pytexera import * -class ProcessTableOperator(UDFTableOperator): - @overrides - def process_table(self, table: Table, port: int) -> Iterator[Optional[TableLike]]: - yield $outputTable -""" + |from pytexera import * + |class ProcessTableOperator(UDFTableOperator): + | + | @overrides + | def process_table(self, table: Table, port: int) -> Iterator[Optional[TableLike]]: + | yield $outputTable + |""".stripMargin } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala index 1f9b69eb326..216284315cb 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala @@ -5,16 +5,10 @@ import com.google.common.base.Preconditions import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.executor.OpExecWithCode import edu.uci.ics.amber.core.tuple.{Attribute, Schema} -import edu.uci.ics.amber.core.workflow.{ - PartitionInfo, - PhysicalOp, - SchemaPropagationFunc, - UnknownPartition -} -import edu.uci.ics.amber.operator.{LogicalOp, PortDescription, StateTransferFunc} -import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, 
PortIdentity} +import edu.uci.ics.amber.core.workflow._ +import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} +import edu.uci.ics.amber.operator.{LogicalOp, PortDescription, StateTransferFunc} import scala.util.{Success, Try} @@ -98,7 +92,7 @@ class PythonUDFOpDescV2 extends LogicalOp { Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build()) } - if (workers > 1) + if (workers > 1) { PhysicalOp .oneToOnePhysicalOp( workflowId, @@ -106,15 +100,9 @@ class PythonUDFOpDescV2 extends LogicalOp { operatorIdentifier, OpExecWithCode(code, "python") ) - .withDerivePartition(_ => UnknownPartition()) - .withInputPorts(operatorInfo.inputPorts) - .withOutputPorts(operatorInfo.outputPorts) - .withPartitionRequirement(partitionRequirement) - .withIsOneToManyOp(true) .withParallelizable(true) .withSuggestedWorkerNum(workers) - .withPropagateSchema(SchemaPropagationFunc(propagateSchema)) - else + } else { PhysicalOp .manyToOnePhysicalOp( workflowId, @@ -122,13 +110,13 @@ class PythonUDFOpDescV2 extends LogicalOp { operatorIdentifier, OpExecWithCode(code, "python") ) - .withDerivePartition(_ => UnknownPartition()) - .withInputPorts(operatorInfo.inputPorts) - .withOutputPorts(operatorInfo.outputPorts) - .withPartitionRequirement(partitionRequirement) - .withIsOneToManyOp(true) .withParallelizable(false) - .withPropagateSchema(SchemaPropagationFunc(propagateSchema)) + }.withDerivePartition(_ => UnknownPartition()) + .withInputPorts(operatorInfo.inputPorts) + .withOutputPorts(operatorInfo.outputPorts) + .withPartitionRequirement(partitionRequirement) + .withIsOneToManyOp(true) + .withPropagateSchema(SchemaPropagationFunc(propagateSchema)) } override def operatorInfo: OperatorInfo = { @@ -165,27 +153,6 @@ class PythonUDFOpDescV2 extends LogicalOp { allowPortCustomization = true ) } - - override def getOutputSchema(schemas: Array[Schema]): Schema = { - // Preconditions.checkArgument(schemas.length == 1) - val inputSchema = schemas(0) - val 
outputSchemaBuilder = Schema.builder() - // keep the same schema from input - if (retainInputColumns) outputSchemaBuilder.add(inputSchema) - // for any pythonUDFType, it can add custom output columns (attributes). - if (outputColumns != null) { - if (retainInputColumns) { // check if columns are duplicated - - for (column <- outputColumns) { - if (inputSchema.containsAttribute(column.getName)) - throw new RuntimeException("Column name " + column.getName + " already exists!") - } - } - outputSchemaBuilder.add(outputColumns).build() - } - outputSchemaBuilder.build() - } - override def runtimeReconfiguration( workflowId: WorkflowIdentity, executionId: ExecutionIdentity, diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala index 086b014ea68..a219ba2808a 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala @@ -8,7 +8,7 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{OutputPort, PortIdentity} +import edu.uci.ics.amber.core.workflow.OutputPort class PythonUDFSourceOpDescV2 extends SourceOperatorDescriptor { @@ -41,18 +41,14 @@ class PythonUDFSourceOpDescV2 extends SourceOperatorDescriptor { executionId: ExecutionIdentity ): PhysicalOp = { require(workers >= 1, "Need at least 1 worker.") - - val func = SchemaPropagationFunc { _: Map[PortIdentity, Schema] => - val outputSchema = sourceSchema() - 
Map(operatorInfo.outputPorts.head.id -> outputSchema) - } - val physicalOp = PhysicalOp .sourcePhysicalOp(workflowId, executionId, operatorIdentifier, OpExecWithCode(code, "python")) .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) .withIsOneToManyOp(true) - .withPropagateSchema(func) + .withPropagateSchema( + SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> sourceSchema())) + ) .withLocationPreference(Option.empty) if (workers > 1) { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala index 42445e21e16..bc9d6ec1b5e 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala @@ -91,7 +91,7 @@ class RUDFOpDesc extends LogicalOp { } val r_operator_type = if (useTupleAPI) "r-tuple" else "r-table" - if (workers > 1) + if (workers > 1) { PhysicalOp .oneToOnePhysicalOp( workflowId, @@ -99,15 +99,9 @@ class RUDFOpDesc extends LogicalOp { operatorIdentifier, OpExecWithCode(code, r_operator_type) ) - .withDerivePartition(_ => UnknownPartition()) - .withInputPorts(operatorInfo.inputPorts) - .withOutputPorts(operatorInfo.outputPorts) - .withPartitionRequirement(partitionRequirement) - .withIsOneToManyOp(true) .withParallelizable(true) .withSuggestedWorkerNum(workers) - .withPropagateSchema(SchemaPropagationFunc(propagateSchema)) - else + } else { PhysicalOp .manyToOnePhysicalOp( workflowId, @@ -115,13 +109,14 @@ class RUDFOpDesc extends LogicalOp { operatorIdentifier, OpExecWithCode(code, r_operator_type) ) - .withDerivePartition(_ => UnknownPartition()) - .withInputPorts(operatorInfo.inputPorts) - .withOutputPorts(operatorInfo.outputPorts) - .withPartitionRequirement(partitionRequirement) - .withIsOneToManyOp(true) .withParallelizable(false) - 
.withPropagateSchema(SchemaPropagationFunc(propagateSchema)) + }.withDerivePartition(_ => UnknownPartition()) + .withInputPorts(operatorInfo.inputPorts) + .withOutputPorts(operatorInfo.outputPorts) + .withPartitionRequirement(partitionRequirement) + .withIsOneToManyOp(true) + .withPropagateSchema(SchemaPropagationFunc(propagateSchema)) + } override def operatorInfo: OperatorInfo = { @@ -151,32 +146,10 @@ class RUDFOpDesc extends LogicalOp { "User-defined function operator in R script", OperatorGroupConstants.R_GROUP, inputPortInfo, - outputPortInfo, - dynamicInputPorts = false, - dynamicOutputPorts = false, - supportReconfiguration = false, - allowPortCustomization = false + outputPortInfo ) } - override def getOutputSchema(schemas: Array[Schema]): Schema = { - val inputSchema = schemas(0) - val outputSchemaBuilder = Schema.Builder() - // keep the same schema from input - if (retainInputColumns) outputSchemaBuilder.add(inputSchema) - if (outputColumns != null) { - if (retainInputColumns) { // check if columns are duplicated - - for (column <- outputColumns) { - if (inputSchema.containsAttribute(column.getName)) - throw new RuntimeException("Column name " + column.getName + " already exists!") - } - } - outputSchemaBuilder.add(outputColumns) - } - outputSchemaBuilder.build() - } - override def runtimeReconfiguration( workflowId: WorkflowIdentity, executionId: ExecutionIdentity, diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala index 0653228a145..19f65d42c0d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala @@ -8,7 +8,7 @@ import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, 
OperatorInfo} import edu.uci.ics.amber.operator.source.SourceOperatorDescriptor import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{OutputPort, PortIdentity} +import edu.uci.ics.amber.core.workflow.OutputPort class RUDFSourceOpDesc extends SourceOperatorDescriptor { @@ -51,11 +51,6 @@ class RUDFSourceOpDesc extends SourceOperatorDescriptor { val rOperatorType = if (useTupleAPI) "r-tuple" else "r-table" require(workers >= 1, "Need at least 1 worker.") - val func = SchemaPropagationFunc { _: Map[PortIdentity, Schema] => - val outputSchema = sourceSchema() - Map(operatorInfo.outputPorts.head.id -> outputSchema) - } - val physicalOp = PhysicalOp .sourcePhysicalOp( workflowId, @@ -66,7 +61,9 @@ class RUDFSourceOpDesc extends SourceOperatorDescriptor { .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) .withIsOneToManyOp(true) - .withPropagateSchema(func) + .withPropagateSchema( + SchemaPropagationFunc(_ => Map(operatorInfo.outputPorts.head.id -> sourceSchema())) + ) .withLocationPreference(None) if (workers > 1) { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala index 7e75c24e7f6..6ee4fea20d9 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/union/UnionOpDesc.scala @@ -1,13 +1,10 @@ package edu.uci.ics.amber.operator.union -import com.google.common.base.Preconditions import edu.uci.ics.amber.core.executor.OpExecWithClassName -import edu.uci.ics.amber.core.tuple.Schema -import edu.uci.ics.amber.core.workflow.PhysicalOp +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalOp, PortIdentity} import 
edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} class UnionOpDesc extends LogicalOp { @@ -34,10 +31,4 @@ class UnionOpDesc extends LogicalOp { inputPorts = List(InputPort(PortIdentity(0), allowMultiLinks = true)), outputPorts = List(OutputPort()) ) - - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Preconditions.checkArgument(schemas.forall(_ == schemas(0))) - schemas(0) - } - } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala index 5ac736490da..2eb0fefa152 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala @@ -1,7 +1,6 @@ package edu.uci.ics.amber.operator.unneststring import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} -import com.google.common.base.Preconditions import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} @@ -53,19 +52,17 @@ class UnnestStringOpDesc extends FlatMapOpDesc { .withInputPorts(operatorInfo.inputPorts) .withOutputPorts(operatorInfo.outputPorts) .withPropagateSchema( - SchemaPropagationFunc(inputSchemas => - Map( - operatorInfo.outputPorts.head.id -> getOutputSchema( - operatorInfo.inputPorts.map(_.id).map(inputSchemas(_)).toArray - ) - ) - ) + SchemaPropagationFunc(inputSchemas => { + val outputSchema = + if (resultAttribute == null || resultAttribute.trim.isEmpty) null + else + Schema + 
.builder() + .add(inputSchemas.values.head) + .add(resultAttribute, AttributeType.STRING) + .build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) + }) ) } - - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Preconditions.checkArgument(schemas.forall(_ == schemas(0))) - if (resultAttribute == null || resultAttribute.trim.isEmpty) return null - Schema.builder().add(schemas(0)).add(resultAttribute, AttributeType.STRING).build() - } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala index c6e6a46d34a..ff082be7b3d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala @@ -5,7 +5,7 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName @@ -17,8 +17,14 @@ class DotPlotOpDesc extends PythonOperatorDescriptor { @AutofillAttributeName var countAttribute: String = "" - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build() + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { + val outputSchema = Schema + .builder() + .add(new 
Attribute("html-content", AttributeType.STRING)) + .build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) } override def operatorInfo: OperatorInfo = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala index 7034e9f6bb9..16e682b4163 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala @@ -8,7 +8,7 @@ import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.visualization.hierarchychart.HierarchySection import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} // type constraint: value can only be numeric @JsonSchemaInject(json = """ @@ -34,8 +34,14 @@ class IcicleChartOpDesc extends PythonOperatorDescriptor { @AutofillAttributeName var value: String = "" - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build() + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { + val outputSchema = Schema + .builder() + .add(new Attribute("html-content", AttributeType.STRING)) + .build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) } override def operatorInfo: OperatorInfo = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala index a1389c426d8..5e85d1979b2 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala @@ -4,7 +4,7 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.PythonOperatorDescriptor @@ -16,8 +16,14 @@ class ImageVisualizerOpDesc extends PythonOperatorDescriptor { @AutofillAttributeName var binaryContent: String = _ - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build() + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { + val outputSchema = Schema + .builder() + .add(new Attribute("html-content", AttributeType.STRING)) + .build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) } override def operatorInfo: OperatorInfo = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala index ea84feee197..4b6a366d7c4 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala @@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.visualization.ScatterMatrixChart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.{ AutofillAttributeName, @@ -34,8 +34,14 @@ class ScatterMatrixChartOpDesc extends PythonOperatorDescriptor { @AutofillAttributeName var color: String = "" - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build() + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { + val outputSchema = Schema + .builder() + .add(new Attribute("html-content", AttributeType.STRING)) + .build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) } override def operatorInfo: OperatorInfo = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala index e3a8705cab0..c3924b3275d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala @@ 
-3,11 +3,11 @@ package edu.uci.ics.amber.operator.visualization.barChart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.PythonOperatorDescriptor -import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName -import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} //type constraint: value can only be numeric @JsonSchemaInject(json = """ @@ -50,8 +50,14 @@ class BarChartOpDesc extends PythonOperatorDescriptor { @AutofillAttributeName var pattern: String = "" - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build() + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { + val outputSchema = Schema + .builder() + .add(new Attribute("html-content", AttributeType.STRING)) + .build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) } override def operatorInfo: OperatorInfo = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala index 064992b96d5..5df97e865a1 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala +++ 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala @@ -4,7 +4,7 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.PythonOperatorDescriptor @@ -37,8 +37,14 @@ class BoxPlotOpDesc extends PythonOperatorDescriptor { ) var quertiletype: BoxPlotQuartileFunction = _ - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build() + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { + val outputSchema = Schema + .builder() + .add(new Attribute("html-content", AttributeType.STRING)) + .build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) } override def operatorInfo: OperatorInfo = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala index aa589a33c24..3a4db9d8e91 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala @@ -5,7 +5,7 @@ import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import 
edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
@@ -43,8 +43,14 @@ class BubbleChartOpDesc extends PythonOperatorDescriptor {
   @JsonPropertyDescription("Picks data column to color bubbles with if color is enabled")
   @AutofillAttributeName
   var colorCategory: String = ""
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala
index a344e3fb6d6..80ee1ff31e1 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala
@@ -7,7 +7,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class CandlestickChartOpDesc extends PythonOperatorDescriptor {
 
@@ -41,8 +41,14 @@ class CandlestickChartOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var close: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala
index e81d1d87f78..78d818cc161 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala
@@ -6,7 +6,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 import java.util
 import scala.jdk.CollectionConverters.ListHasAsScala
@@ -25,8 +25,14 @@ class ContinuousErrorBandsOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(value = "bands", required = true)
   var bands: util.List[BandConfig] = _
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala
index 14001e6d3ec..0a132c2c996 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala
@@ -7,7 +7,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class ContourPlotOpDesc extends PythonOperatorDescriptor {
 
@@ -46,8 +46,14 @@ class ContourPlotOpDesc extends PythonOperatorDescriptor {
   )
   var coloringMethod: ContourPlotColoringFunction = _
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala
index eb933ce627d..bac0482bf8a 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala
@@ -4,7 +4,7 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
@@ -59,8 +59,14 @@ class DumbbellPlotOpDesc extends PythonOperatorDescriptor {
   @JsonPropertyDescription("whether show legends in the graph")
   var showLegends: Boolean = false;
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala
index 6c3d93b6f2e..32c250b55dd 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala
@@ -6,7 +6,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class FigureFactoryTableOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(required = false)
@@ -104,7 +104,13 @@ class FigureFactoryTableOpDesc extends PythonOperatorDescriptor {
     )
   }
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala
index 106f424bbc1..2e4e0691a08 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala
@@ -4,7 +4,7 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
@@ -46,8 +46,14 @@ class FilledAreaPlotOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var pattern: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala
index c9a32bc8044..a7e8075edff 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala
@@ -7,7 +7,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 @JsonSchemaInject(json = """
 {
   "attributeTypeRules": {
@@ -35,8 +35,14 @@ class FunnelPlotOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var color: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala
index 2a34113e9fb..382035b3d64 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala
@@ -4,7 +4,7 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
@@ -53,8 +53,14 @@ class GanttChartOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var pattern: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala
index 8c11038d25e..3b623fbccc3 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala
@@ -7,7 +7,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class HeatMapOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(value = "x", required = true)
@@ -28,8 +28,14 @@ class HeatMapOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var value: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala
index b23f7b511c2..3e09d51484c 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala
@@ -3,11 +3,10 @@ package edu.uci.ics.amber.operator.visualization.hierarchychart
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
-
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 // type constraint: value can only be numeric
 @JsonSchemaInject(json = """
@@ -38,8 +37,14 @@ class HierarchyChartOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var value: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala
index 044dad14065..829f5355224 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala
@@ -4,7 +4,7 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
@@ -94,8 +94,14 @@ class HistogramChartOpDesc extends PythonOperatorDescriptor {
     finalCode
   }
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala
index 2bf48a41ca8..5d84d7e548b 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala
@@ -4,14 +4,13 @@ import com.fasterxml.jackson.annotation.JsonProperty
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
-import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
 
 /**
   * HTML Visualization operator to render any given HTML code
@@ -39,13 +38,13 @@
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
       .withPropagateSchema(
-        SchemaPropagationFunc(inputSchemas =>
-          Map(
-            operatorInfo.outputPorts.head.id -> getOutputSchema(
-              operatorInfo.inputPorts.map(_.id).map(inputSchemas(_)).toArray
-            )
-          )
-        )
+        SchemaPropagationFunc(inputSchemas => {
+          val outputSchema = Schema
+            .builder()
+            .add(new Attribute("html-content", AttributeType.STRING))
+            .build()
+          Map(operatorInfo.outputPorts.head.id -> outputSchema)
+        })
       )
   }
 
@@ -58,6 +57,4 @@
     outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema =
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala
index e7deebe579d..69eb7f83d12 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala
@@ -6,7 +6,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 import java.util
 import scala.jdk.CollectionConverters.ListHasAsScala
@@ -26,8 +26,14 @@ class LineChartOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(value = "lines", required = true)
   var lines: util.List[LineConfig] = _
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala
index 923ca5a619a..5ff0bfa88ae 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala
@@ -5,7 +5,7 @@ import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchema
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
@@ -33,8 +33,14 @@ class PieChartOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var name: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala
index 054d02b8090..58f8759594b 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala
@@ -5,7 +5,7 @@ import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchema
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
@@ -42,8 +42,14 @@ class QuiverPlotOpDesc extends PythonOperatorDescriptor {
   @JsonPropertyDescription("column for the vector component in the y-direction")
   @AutofillAttributeName
   var v: String = ""
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala
index f00c164743b..ca8cff0cdcb 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala
@@ -7,7 +7,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class SankeyDiagramOpDesc extends PythonOperatorDescriptor {
 
@@ -29,8 +29,14 @@ class SankeyDiagramOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var valueAttribute: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala
index d0e71870398..2a42c2c84dc 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala
@@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.visualization.scatter3DChart
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
@@ -34,8 +34,14 @@ class Scatter3dChartOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var z: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala
index 8195441602b..d56a2c45d1d 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala
@@ -4,7 +4,7 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
@@ -60,8 +60,14 @@ class ScatterplotOpDesc extends PythonOperatorDescriptor {
   @AutofillAttributeName
   var hoverName: String = ""
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala
index 941d681a6ed..87d174d01e9 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala
@@ -5,7 +5,7 @@ import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class TablesPlotOpDesc extends PythonOperatorDescriptor {
   @JsonPropertyDescription("List of columns to include in the table chart")
@@ -80,7 +80,13 @@ class TablesPlotOpDesc extends PythonOperatorDescriptor {
     )
   }
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala
index 91db1d5e1b2..2840ea421da 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala
@@ -7,7 +7,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 /**
   * Visualization Operator for Ternary Plots.
@@ -57,9 +57,14 @@ class TernaryPlotOpDesc extends PythonOperatorDescriptor {
     outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
-  /** Returns the output schema set as html-content */
-  override def getOutputSchema(schemas: Array[Schema]): Schema = {
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
+  override def getOutputSchemas(
+      inputSchemas: Map[PortIdentity, Schema]
+  ): Map[PortIdentity, Schema] = {
+    val outputSchema = Schema
+      .builder()
+      .add(new Attribute("html-content", AttributeType.STRING))
+      .build()
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   /** Returns a Python string that drops any tuples with missing values */
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala
index 9df368b0ec9..90482deaaa5 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala
@@ -4,10 +4,9 @@ import com.fasterxml.jackson.annotation.JsonProperty
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
-import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.util.JSONUtils.objectMapper
@@ -50,13 +49,13 @@
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
       .withPropagateSchema(
-        SchemaPropagationFunc(inputSchemas =>
-          Map(
-            operatorInfo.outputPorts.head.id -> getOutputSchema(
-              operatorInfo.inputPorts.map(_.id).map(inputSchemas(_)).toArray
-            )
-          )
-        )
+        SchemaPropagationFunc(_ => {
+          val outputSchema = Schema
+            .builder()
+            .add(new Attribute("html-content", AttributeType.STRING))
+            .build()
+          Map(operatorInfo.outputPorts.head.id -> outputSchema)
+        })
       )
   }
 
@@ -69,7 +68,4 @@
     outputPorts = List(OutputPort(mode = OutputMode.SINGLE_SNAPSHOT))
   )
 
-  override def getOutputSchema(schemas: Array[Schema]): Schema =
-    Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build()
-
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala
index 15bee2d2506..2ba19165765 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala
@@ -7,7 +7,7 @@ import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort}
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 
 class WaterfallChartOpDesc extends PythonOperatorDescriptor
{ @@ -23,8 +23,14 @@ class WaterfallChartOpDesc extends PythonOperatorDescriptor { @AutofillAttributeName var yColumn: String = _ - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Schema.builder().add(new Attribute("html-content", AttributeType.STRING)).build() + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { + val outputSchema = Schema + .builder() + .add(new Attribute("html-content", AttributeType.STRING)) + .build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) } override def operatorInfo: OperatorInfo = diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala index 516bc3ab3b4..e6e2c408e48 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala @@ -12,7 +12,7 @@ import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.visualization.ImageUtility import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} class WordCloudOpDesc extends PythonOperatorDescriptor { @JsonProperty(required = true) @JsonSchemaTitle("Text column") @@ -24,8 +24,14 @@ class WordCloudOpDesc extends PythonOperatorDescriptor { @JsonSchemaInject(ints = Array(new JsonSchemaInt(path = "exclusiveMinimum", value = 0))) var topN: Integer = 100 - override def getOutputSchema(schemas: Array[Schema]): Schema = { - Schema.builder().add(new Attribute("html-content", 
AttributeType.STRING)).build() + override def getOutputSchemas( + inputSchemas: Map[PortIdentity, Schema] + ): Map[PortIdentity, Schema] = { + val outputSchema = Schema + .builder() + .add(new Attribute("html-content", AttributeType.STRING)) + .build() + Map(operatorInfo.outputPorts.head.id -> outputSchema) } override def operatorInfo: OperatorInfo = diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpExecSpec.scala index a913ff15576..60725d53295 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpExecSpec.scala @@ -8,6 +8,7 @@ import edu.uci.ics.amber.core.tuple.{ Tuple, TupleLike } +import edu.uci.ics.amber.core.workflow.PortIdentity import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec @@ -102,8 +103,8 @@ class CartesianProductOpExecSpec extends AnyFlatSpec with BeforeAndAfter { .add(generate_schema("right", numRightSchemaAttributes - 1)) .add(duplicateAttribute) .build() - val inputSchemas = Array(leftSchema, rightSchema) - val outputSchema = opDesc.getOutputSchema(inputSchemas) + val inputSchemas = Map(PortIdentity() -> leftSchema, PortIdentity(1) -> rightSchema) + val outputSchema = opDesc.getExternalOutputSchemas(inputSchemas).values.head // verify output schema is as expected & has no duplicates assert( diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala index f952d847e7f..1d19700e071 100644 --- 
a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala @@ -1,6 +1,7 @@ package edu.uci.ics.amber.operator.dictionary import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema, SchemaEnforceable, Tuple} +import edu.uci.ics.amber.core.workflow.PortIdentity import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec @@ -35,7 +36,7 @@ class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter { opDesc.dictionary = dictionaryScan opDesc.resultAttribute = "matched" opDesc.matchingType = MatchingType.SCANBASED - outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) + outputSchema = opDesc.getExternalOutputSchemas(Map(PortIdentity() -> tupleSchema)).values.head } it should "open" in { diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala index 2049d89f7f4..c3cd8f6ecd3 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala @@ -11,6 +11,7 @@ import edu.uci.ics.amber.core.tuple.{ Tuple, TupleLike } +import edu.uci.ics.amber.core.workflow.PortIdentity import edu.uci.ics.amber.operator.hashJoin.HashJoinBuildOpExec import edu.uci.ics.amber.util.JSONUtils.objectMapper class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { @@ -53,8 +54,8 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { opDesc.buildAttributeName = "build_1" opDesc.probeAttributeName = "probe_1" opDesc.joinType = JoinType.INNER - val inputSchemas = Array(schema("build"), schema("probe")) - val outputSchema = 
opDesc.getOutputSchema(inputSchemas) + val inputSchemas = Map(PortIdentity() -> schema("build"), PortIdentity(1) -> schema("probe")) + val outputSchema = opDesc.getExternalOutputSchemas(inputSchemas).values.head buildOpExec = new HashJoinBuildOpExec[String](objectMapper.writeValueAsString(opDesc)) buildOpExec.open() @@ -79,7 +80,7 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { buildOpOutputIterator .next() .asInstanceOf[SchemaEnforceable] - .enforceSchema(getInternalHashTableSchema(inputSchemas.head)), + .enforceSchema(getInternalHashTableSchema(inputSchemas.head._2)), build ) .isEmpty @@ -109,8 +110,9 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { opDesc.buildAttributeName = "same" opDesc.probeAttributeName = "same" opDesc.joinType = JoinType.INNER - val inputSchemas = Array(schema("same", 1), schema("same", 2)) - val outputSchema = opDesc.getOutputSchema(inputSchemas) + val inputSchemas = + Map(PortIdentity() -> schema("same", 1), PortIdentity(1) -> schema("same", 2)) + val outputSchema = opDesc.getExternalOutputSchemas(inputSchemas).values.head buildOpExec = new HashJoinBuildOpExec[String](objectMapper.writeValueAsString(opDesc)) buildOpExec.open() @@ -134,7 +136,7 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { buildOpOutputIterator .next() .asInstanceOf[SchemaEnforceable] - .enforceSchema(getInternalHashTableSchema(inputSchemas.head)), + .enforceSchema(getInternalHashTableSchema(inputSchemas.head._2)), build ) .isEmpty @@ -163,8 +165,9 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { opDesc.buildAttributeName = "same" opDesc.probeAttributeName = "same" opDesc.joinType = JoinType.FULL_OUTER - val inputSchemas = Array(schema("same", 1), schema("same", 2)) - val outputSchema = opDesc.getOutputSchema(inputSchemas) + val inputSchemas = + Map(PortIdentity() -> schema("same", 1), PortIdentity(1) -> schema("same", 2)) + val outputSchema = opDesc.getExternalOutputSchemas(inputSchemas).values.head 
buildOpExec = new HashJoinBuildOpExec[String](objectMapper.writeValueAsString(opDesc)) buildOpExec.open() @@ -188,7 +191,7 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter { buildOpOutputIterator .next() .asInstanceOf[SchemaEnforceable] - .enforceSchema(getInternalHashTableSchema(inputSchemas.head)), + .enforceSchema(getInternalHashTableSchema(inputSchemas.head._2)), build ) .isEmpty diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala index 72c062ed319..e8c26d84a23 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala @@ -213,7 +213,10 @@ class IntervalOpExecSpec extends AnyFlatSpec with BeforeAndAfter { rightInput: Array[T] ): Unit = { val inputSchemas = - Array(schema(leftKey, dataType), schema(rightKey, dataType)) + Map( + PortIdentity() -> schema(leftKey, dataType), + PortIdentity(1) -> schema(rightKey, dataType) + ) opDesc = new IntervalJoinOpDesc( leftKey, rightKey, @@ -222,7 +225,7 @@ class IntervalOpExecSpec extends AnyFlatSpec with BeforeAndAfter { includeRightBound, timeIntervalType ) - val outputSchema = opDesc.getOutputSchema(inputSchemas) + val outputSchema = opDesc.getExternalOutputSchemas(inputSchemas).values.head val opExec = new IntervalJoinOpExec(objectMapper.writeValueAsString(opDesc)) opExec.open() counter = 0 diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDescSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDescSpec.scala index b23aeb484ea..32f3cb5a86b 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDescSpec.scala +++ 
b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDescSpec.scala @@ -1,6 +1,7 @@ package edu.uci.ics.amber.operator.projection import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.workflow.PortIdentity import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec class ProjectionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { @@ -30,7 +31,8 @@ class ProjectionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { new AttributeUnit("field1", "f1"), new AttributeUnit("field2", "f2") ) - val outputSchema = projectionOpDesc.getOutputSchema(Array(schema)) + val outputSchema = + projectionOpDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head assert(outputSchema.getAttributes.length == 2) } @@ -40,7 +42,8 @@ class ProjectionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { new AttributeUnit("field2", "f2"), new AttributeUnit("field1", "f1") ) - val outputSchema = projectionOpDesc.getOutputSchema(Array(schema)) + val outputSchema = + projectionOpDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head assert(outputSchema.getAttributes.length == 2) assert(outputSchema.getIndex("f2") == 0) assert(outputSchema.getIndex("f1") == 1) @@ -53,7 +56,7 @@ class ProjectionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { new AttributeUnit("field---6", "f6") ) assertThrows[RuntimeException] { - projectionOpDesc.getOutputSchema(Array(schema)) + projectionOpDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head } } @@ -61,20 +64,7 @@ class ProjectionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { it should "raise IllegalArgumentException on empty attributes" in { assertThrows[IllegalArgumentException] { - projectionOpDesc.getOutputSchema(Array(schema)) - } - - } - - it should "raise IllegalArgumentException with multiple input source Schema" in { - - projectionOpDesc.attributes ++= List( - new 
AttributeUnit("field2", "f2"), - new AttributeUnit("field1", "f1") - ) - - assertThrows[IllegalArgumentException] { - projectionOpDesc.getOutputSchema(Array(schema, schema)) + projectionOpDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head } } @@ -86,7 +76,7 @@ class ProjectionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { new AttributeUnit("field1", "f") ) assertThrows[RuntimeException] { - projectionOpDesc.getOutputSchema(Array(schema)) + projectionOpDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head } } @@ -95,7 +85,8 @@ class ProjectionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { new AttributeUnit("field1", "f1"), new AttributeUnit("field2", "") ) - val outputSchema = projectionOpDesc.getOutputSchema(Array(schema)) + val outputSchema = + projectionOpDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head assert(outputSchema.getAttributes.length == 2) } diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala index bf1a6122e1c..55a09751ffe 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDescSpec.scala @@ -4,7 +4,6 @@ import edu.uci.ics.amber.core.storage.FileResolver import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.WorkflowContext.{DEFAULT_EXECUTION_ID, DEFAULT_WORKFLOW_ID} import edu.uci.ics.amber.operator.TestOperators -import edu.uci.ics.amber.core.workflow.PortIdentity import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec @@ -14,11 +13,7 @@ class CSVScanSourceOpDescSpec extends AnyFlatSpec with BeforeAndAfter { var parallelCsvScanSourceOpDesc: 
ParallelCSVScanSourceOpDesc = _ before { csvScanSourceOpDesc = new CSVScanSourceOpDesc() - csvScanSourceOpDesc.outputPortToSchemaMapping(PortIdentity()) = - csvScanSourceOpDesc.getOutputSchema(Array()) parallelCsvScanSourceOpDesc = new ParallelCSVScanSourceOpDesc() - parallelCsvScanSourceOpDesc.outputPortToSchemaMapping(PortIdentity()) = - parallelCsvScanSourceOpDesc.getOutputSchema(Array()) } it should "infer schema from single-line-data csv" in { diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDescSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDescSpec.scala index 147384f71e4..190da462e99 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDescSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDescSpec.scala @@ -1,6 +1,7 @@ package edu.uci.ics.amber.operator.udf.python import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.workflow.PortIdentity import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec class PythonLambdaFunctionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { @@ -25,7 +26,7 @@ class PythonLambdaFunctionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { AttributeType.STRING ) ) - val outputSchema = opDesc.getOutputSchema(Array(schema)) + val outputSchema = opDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head assert(outputSchema.getAttributes.length == 4) } @@ -44,7 +45,7 @@ class PythonLambdaFunctionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { AttributeType.INTEGER ) ) - val outputSchema = opDesc.getOutputSchema(Array(schema)) + val outputSchema = opDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head assert(outputSchema.getAttributes.length == 5) } @@ -57,7 +58,7 @@ class 
PythonLambdaFunctionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { AttributeType.STRING ) ) - val outputSchema = opDesc.getOutputSchema(Array(schema)) + val outputSchema = opDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head assert(outputSchema.getAttributes.length == 3) } @@ -72,7 +73,7 @@ class PythonLambdaFunctionOpDescSpec extends AnyFlatSpec with BeforeAndAfter { ) assertThrows[RuntimeException] { - opDesc.getOutputSchema(Array(schema)) + opDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head } } diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala index 8ce75b6fd5f..d63b82900bb 100644 --- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala @@ -28,15 +28,13 @@ class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { opDesc.attribute = "field1" opDesc.delimiter = "-" opDesc.resultAttribute = "split" - opDesc.inputPortToSchemaMapping(PortIdentity()) = tupleSchema - opDesc.outputPortToSchemaMapping(PortIdentity()) = opDesc.getOutputSchema(Array(tupleSchema)) } it should "open" in { opDesc.attribute = "field1" opDesc.delimiter = "-" opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) - outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) + outputSchema = opDesc.getExternalOutputSchemas(Map(PortIdentity() -> tupleSchema)).values.head opExec.open() assert(opExec.flatMapFunc != null) } @@ -45,7 +43,7 @@ class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { opDesc.attribute = "field1" opDesc.delimiter = "-" opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) - outputSchema = 
opDesc.getOutputSchema(Array(tupleSchema)) + outputSchema = opDesc.getExternalOutputSchemas(Map(PortIdentity() -> tupleSchema)).values.head opExec.open() val processedTuple = opExec .processTuple(tuple, 0) @@ -61,7 +59,7 @@ class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { opDesc.attribute = "field3" opDesc.delimiter = "-" opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) - outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) + outputSchema = opDesc.getExternalOutputSchemas(Map(PortIdentity() -> tupleSchema)).values.head opExec.open() val processedTuple = opExec .processTuple(tuple, 0) @@ -75,7 +73,7 @@ class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { opDesc.attribute = "field1" opDesc.delimiter = "/" opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) - outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) + outputSchema = opDesc.getExternalOutputSchemas(Map(PortIdentity() -> tupleSchema)).values.head val tuple: Tuple = Tuple .builder(tupleSchema) .add(new Attribute("field1", AttributeType.STRING), "//a//b/") @@ -97,7 +95,7 @@ class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter { opDesc.attribute = "field1" opDesc.delimiter = "<\\d*>" opExec = new UnnestStringOpExec(objectMapper.writeValueAsString(opDesc)) - outputSchema = opDesc.getOutputSchema(Array(tupleSchema)) + outputSchema = opDesc.getExternalOutputSchemas(Map(PortIdentity() -> tupleSchema)).values.head val tuple: Tuple = Tuple .builder(tupleSchema) .add(new Attribute("field1", AttributeType.STRING), "<>a<1>b<12>") diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExecSpec.scala index f8aa526a0c9..b636703c461 100644 --- 
a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExecSpec.scala +++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpExecSpec.scala @@ -1,6 +1,7 @@ package edu.uci.ics.amber.operator.visualization.htmlviz import edu.uci.ics.amber.core.tuple._ +import edu.uci.ics.amber.core.workflow.PortIdentity import edu.uci.ics.amber.util.JSONUtils.objectMapper import org.scalatest.BeforeAndAfter import org.scalatest.flatspec.AnyFlatSpec @@ -11,7 +12,8 @@ class HtmlVizOpExecSpec extends AnyFlatSpec with BeforeAndAfter { ) val opDesc: HtmlVizOpDesc = new HtmlVizOpDesc() - val outputSchema: Schema = opDesc.getOutputSchema(Array(schema)) + val outputSchema: Schema = + opDesc.getExternalOutputSchemas(Map(PortIdentity() -> schema)).values.head def tuple(): Tuple = Tuple From 2556432685df3c13c59e08e6cbae56d32400d062 Mon Sep 17 00:00:00 2001 From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com> Date: Tue, 31 Dec 2024 15:34:17 -0800 Subject: [PATCH 25/47] Add single schema per port validation during compilation (#3187) Each port must have exactly one schema. If multiple links are connected to the same port, they are required to share the same schema. This PR introduces a validation step during schema propagation to ensure this constraint is enforced as part of the compilation process. For example, consider a Union operator with a single input port that supports multiple links. 
If upstream operators produce differing output schemas, the validation will fail with an appropriate error message: ![CleanShot 2024-12-31 at 14 56 36](https://github.com/user-attachments/assets/077594b4-26fb-4983-9ce4-d7c67365645e) --- .../uci/ics/amber/core/workflow/PhysicalOp.scala | 14 ++++++++++++-- 1 file changed, 12 insertions(+), 2 deletions(-) diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala index d493f0891a5..00f244d6e15 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/workflow/PhysicalOp.scala @@ -398,8 +398,18 @@ case class PhysicalOp( */ def propagateSchema(newInputSchema: Option[(PortIdentity, Schema)] = None): PhysicalOp = { // Update the input schema if a new one is provided - val updatedOp = newInputSchema.foldLeft(this) { - case (op, (portId, schema)) => op.withInputSchema(portId, Right(schema)) + val updatedOp = newInputSchema.foldLeft(this) { (op, schemaEntry) => + val (portId, schema) = schemaEntry + op.inputPorts(portId)._3 match { + case Left(_) => + op.withInputSchema(portId, Right(schema)) + case Right(existingSchema) if existingSchema != schema => + throw new IllegalArgumentException( + s"Conflict schemas received on port ${portId.id}, $existingSchema != $schema" + ) + case _ => + op + } } // Extract input schemas, checking if all are defined From bf6ffc9bea1562d60e1723407d9bdb45fda868ce Mon Sep 17 00:00:00 2001 From: Xiaozhen Liu Date: Wed, 1 Jan 2025 18:57:03 +0800 Subject: [PATCH 26/47] Add Cost Estimator Using Past Statistics for Schedule Generator (#3156) #### This PR introduces the `CostEstimator` trait which estimates the cost of a region, given some resource units. - The cost estimator is used by `CostBasedScheduleGenerator` to calculate the cost of a schedule during search. 
- Currently we only consider one type of schedule for each region plan, which is a total order of the regions. The cost of the schedule (and also the cost of the region plan) is thus the sum of the costs of the regions.
- The resource units are currently passed as placeholders because we assume a region will have all the resources when doing the estimation. The units may be used in the future if we consider different methods of schedule generation. For example, if we allow two regions to run concurrently, the units will be split in half for each region.

#### A `DefaultCostEstimator` implementation is also added, which uses past execution statistics to estimate the wall-clock runtime of a region:

- The runtime of each region is represented by the runtime of its longest-running operator.
- The runtime of each operator is estimated using the statistics from the **latest successful execution** of the workflow.
- If such statistics do not exist (e.g., if it is the first execution, or if past executions all failed), we fall back to using the number of materialized output ports as the cost.
- Added test cases using mock MySQL data.
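The fallback strategy above can be sketched in plain Scala; `Region` and the statistics lookup here are simplified stand-ins for Texera's actual `Region` and runtime-statistics types, not the real API:

```scala
// Simplified sketch of the estimation strategy (hypothetical types).
case class Region(operatorIds: Set[String], materializedPortIds: Set[String])

class SketchCostEstimator(pastOpRuntimes: Option[Map[String, Double]]) {
  val DefaultOperatorCost = 1.0

  def estimate(region: Region, resourceUnits: Int): Double =
    pastOpRuntimes match {
      case Some(runtimes) =>
        // With past statistics: a region's cost is the runtime of its
        // longest-running operator (a schedule is a total order of regions).
        region.operatorIds
          .map(id => runtimes.getOrElse(id, DefaultOperatorCost))
          .max
      case None =>
        // Without past statistics (e.g., first execution): fall back to
        // the number of materialized output ports in the region.
        region.materializedPortIds.size.toDouble
    }
}
```

A schedule's cost is then the sum over its regions, mirroring the change to `evaluate` in `CostBasedScheduleGenerator`.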
--- .../CostBasedScheduleGenerator.scala | 42 ++- .../scheduling/CostEstimator.scala | 148 ++++++++++ .../scheduling/DefaultCostEstimatorSpec.scala | 258 ++++++++++++++++++ 3 files changed, 435 insertions(+), 13 deletions(-) create mode 100644 core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostEstimator.scala create mode 100644 core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/DefaultCostEstimatorSpec.scala diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGenerator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGenerator.scala index 457bd15e6ab..eb848c1d8f1 100644 --- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGenerator.scala +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostBasedScheduleGenerator.scala @@ -34,6 +34,9 @@ class CostBasedScheduleGenerator( numStatesExplored: Int = 0 ) + private val costEstimator = + new DefaultCostEstimator(workflowContext = workflowContext, actorId = actorId) + def generate(): (Schedule, PhysicalPlan) = { val startTime = System.nanoTime() val regionDAG = createRegionDAG() @@ -281,7 +284,9 @@ class CostBasedScheduleGenerator( if (oEarlyStop) schedulableStates.add(currentState) // Calculate the current state's cost and update the bestResult if it's lower val cost = - evaluate(regionDAG.vertexSet().asScala.toSet, regionDAG.edgeSet().asScala.toSet) + evaluate( + RegionPlan(regionDAG.vertexSet().asScala.toSet, regionDAG.edgeSet().asScala.toSet) + ) if (cost < bestResult.cost) { bestResult = SearchResult(currentState, regionDAG, cost) } @@ -334,7 +339,12 @@ class CostBasedScheduleGenerator( physicalPlan.getNonMaterializedBlockingAndDependeeLinks ++ neighborState ) match { case Left(regionDAG) => - evaluate(regionDAG.vertexSet().asScala.toSet, regionDAG.edgeSet().asScala.toSet) + evaluate( + 
RegionPlan( + regionDAG.vertexSet().asScala.toSet, + regionDAG.edgeSet().asScala.toSet + ) + ) case Right(_) => Double.MaxValue } @@ -423,7 +433,9 @@ class CostBasedScheduleGenerator( def updateOptimumIfApplicable(regionDAG: DirectedAcyclicGraph[Region, RegionLink]): Unit = { // Calculate the current state's cost and update the bestResult if it's lower val cost = - evaluate(regionDAG.vertexSet().asScala.toSet, regionDAG.edgeSet().asScala.toSet) + evaluate( + RegionPlan(regionDAG.vertexSet().asScala.toSet, regionDAG.edgeSet().asScala.toSet) + ) if (cost < bestResult.cost) { bestResult = SearchResult(currentState, regionDAG, cost) } @@ -453,7 +465,12 @@ class CostBasedScheduleGenerator( physicalPlan.getNonMaterializedBlockingAndDependeeLinks ++ neighborState ) match { case Left(regionDAG) => - evaluate(regionDAG.vertexSet().asScala.toSet, regionDAG.edgeSet().asScala.toSet) + evaluate( + RegionPlan( + regionDAG.vertexSet().asScala.toSet, + regionDAG.edgeSet().asScala.toSet + ) + ) case Right(_) => Double.MaxValue } @@ -472,17 +489,16 @@ class CostBasedScheduleGenerator( } /** - * The cost function used by the search. Takes in a region graph represented as set of regions and links. + * The cost function used by the search. Takes a region plan, generates one or more (to be done in the future) + * schedules based on the region plan, and calculates the cost of the schedule(s) using Cost Estimator. Uses the cost + * of the best schedule (currently only considers one schedule) as the cost of the region plan. * - * @param regions A set of regions created based on a search state. - * @param regionLinks A set of links to indicate dependencies between regions, based on the materialization edges. - * @return A cost determined by the resource allocator. + * @return A cost determined by the cost estimator. */ - private def evaluate(regions: Set[Region], regionLinks: Set[RegionLink]): Double = { - // Using number of materialized ports as the cost. 
- // This is independent of the schedule / resource allocator. - // In the future we may need to use the ResourceAllocator to get the cost. - regions.flatMap(_.materializedPortIds).size + private def evaluate(regionPlan: RegionPlan): Double = { + val schedule = generateScheduleFromRegionPlan(regionPlan) + // In the future we may allow multiple regions in a level and split the resources. + schedule.map(level => level.map(region => costEstimator.estimate(region, 1)).sum).sum } } diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostEstimator.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostEstimator.scala new file mode 100644 index 00000000000..c675d44f154 --- /dev/null +++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/scheduling/CostEstimator.scala @@ -0,0 +1,148 @@ +package edu.uci.ics.amber.engine.architecture.scheduling + +import edu.uci.ics.amber.core.storage.StorageConfig +import edu.uci.ics.amber.core.workflow.WorkflowContext +import edu.uci.ics.amber.engine.architecture.scheduling.DefaultCostEstimator.DEFAULT_OPERATOR_COST +import edu.uci.ics.amber.engine.common.AmberLogging +import edu.uci.ics.amber.core.virtualidentity.ActorVirtualIdentity +import edu.uci.ics.texera.dao.SqlServer +import edu.uci.ics.texera.dao.SqlServer.withTransaction +import edu.uci.ics.texera.dao.jooq.generated.Tables.{ + WORKFLOW_EXECUTIONS, + WORKFLOW_RUNTIME_STATISTICS, + WORKFLOW_VERSION +} +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos.WorkflowRuntimeStatistics +import org.jooq.types.UInteger + +import scala.jdk.CollectionConverters.ListHasAsScala +import scala.util.{Failure, Success, Try} + +/** + * A cost estimator should estimate a cost of running a region under the given resource constraints as units. 
+ */
+trait CostEstimator {
+  def estimate(region: Region, resourceUnits: Int): Double
+}
+
+object DefaultCostEstimator {
+  val DEFAULT_OPERATOR_COST: Double = 1.0
+}
+
+/**
+ * A default cost estimator using past statistics. If past statistics of a workflow are available, the cost of a region
+ * is the execution time of its longest-running operator. Otherwise the cost is the number of materialized ports in the
+ * region.
+ */
+class DefaultCostEstimator(
+    workflowContext: WorkflowContext,
+    val actorId: ActorVirtualIdentity
+) extends CostEstimator
+    with AmberLogging {
+
+  // Requires a MySQL database to retrieve execution statistics; otherwise the number of materialized ports is used as a default.
+  private val operatorEstimatedTimeOption = Try(
+    this.getOperatorExecutionTimeInSeconds(
+      this.workflowContext.workflowId.id
+    )
+  ) match {
+    case Failure(_)      => None
+    case Success(result) => result
+  }
+
+  operatorEstimatedTimeOption match {
+    case None =>
+      logger.info(
+        s"WID: ${workflowContext.workflowId.id}, EID: ${workflowContext.executionId.id}, " +
+          s"no past execution statistics available. Using number of materialized output ports as the cost."
+      )
+    case Some(_) =>
+  }
+
+  override def estimate(region: Region, resourceUnits: Int): Double = {
+    this.operatorEstimatedTimeOption match {
+      case Some(operatorEstimatedTime) =>
+        // Use past statistics (wall-clock runtime). We use the execution time of the longest-running
+        // operator in each region to represent the region's execution time, and use the sum of all the regions'
+        // execution times as the wall-clock runtime of the workflow.
+        // This assumes a schedule is a total order of the regions.
+ val opExecutionTimes = region.getOperators.map(op => { + operatorEstimatedTime.getOrElse(op.id.logicalOpId.id, DEFAULT_OPERATOR_COST) + }) + val longestRunningOpExecutionTime = opExecutionTimes.max + longestRunningOpExecutionTime + case None => + // Without past statistics (e.g., first execution), we use number of materialized ports as the cost. + // This is independent of the schedule / resource allocator. + region.materializedPortIds.size + } + } + + /** + * Retrieve the latest successful execution to get statistics to calculate costs in DefaultCostEstimator. + * Using the total control processing time plus data processing time of an operator as its cost. + * If no past statistics are available (e.g., first execution), return None. + */ + private def getOperatorExecutionTimeInSeconds( + wid: Long + ): Option[Map[String, Double]] = { + + val operatorEstimatedTimeOption = withTransaction( + SqlServer + .getInstance( + StorageConfig.jdbcUrl, + StorageConfig.jdbcUsername, + StorageConfig.jdbcPassword + ) + .createDSLContext() + ) { context => + val widAsUInteger = UInteger.valueOf(wid) + val rawStats = context + .select( + WORKFLOW_RUNTIME_STATISTICS.OPERATOR_ID, + WORKFLOW_RUNTIME_STATISTICS.TIME, + WORKFLOW_RUNTIME_STATISTICS.DATA_PROCESSING_TIME, + WORKFLOW_RUNTIME_STATISTICS.CONTROL_PROCESSING_TIME, + WORKFLOW_RUNTIME_STATISTICS.EXECUTION_ID + ) + .from(WORKFLOW_RUNTIME_STATISTICS) + .where( + WORKFLOW_RUNTIME_STATISTICS.WORKFLOW_ID + .eq(widAsUInteger) + .and( + WORKFLOW_RUNTIME_STATISTICS.EXECUTION_ID.eq( + context + .select( + WORKFLOW_EXECUTIONS.EID + ) + .from(WORKFLOW_EXECUTIONS) + .join(WORKFLOW_VERSION) + .on(WORKFLOW_VERSION.VID.eq(WORKFLOW_EXECUTIONS.VID)) + .where( + WORKFLOW_VERSION.WID + .eq(widAsUInteger) + .and(WORKFLOW_EXECUTIONS.STATUS.eq(3.toByte)) + ) + .orderBy(WORKFLOW_EXECUTIONS.STARTING_TIME.desc()) + .limit(1) + ) + ) + ) + .orderBy(WORKFLOW_RUNTIME_STATISTICS.TIME, WORKFLOW_RUNTIME_STATISTICS.OPERATOR_ID) + 
.fetchInto(classOf[WorkflowRuntimeStatistics]) + .asScala + .toList + if (rawStats.isEmpty) { + None + } else { + val cumulatedStats = rawStats.foldLeft(Map.empty[String, Double]) { (acc, stat) => + val opTotalExecutionTime = acc.getOrElse(stat.getOperatorId, 0.0) + acc + (stat.getOperatorId -> (opTotalExecutionTime + (stat.getDataProcessingTime + .doubleValue() + stat.getControlProcessingTime.doubleValue()) / 1e9)) + } + Some(cumulatedStats) + } + } + operatorEstimatedTimeOption + } +} diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/DefaultCostEstimatorSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/DefaultCostEstimatorSpec.scala new file mode 100644 index 00000000000..636a82d7dc8 --- /dev/null +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/scheduling/DefaultCostEstimatorSpec.scala @@ -0,0 +1,258 @@ +package edu.uci.ics.amber.engine.architecture.scheduling + +import edu.uci.ics.amber.core.workflow.{PortIdentity, WorkflowContext} +import edu.uci.ics.amber.engine.common.virtualidentity.util.CONTROLLER +import edu.uci.ics.amber.engine.e2e.TestUtils.buildWorkflow +import edu.uci.ics.amber.operator.TestOperators +import edu.uci.ics.amber.operator.aggregate.{AggregateOpDesc, AggregationFunction} +import edu.uci.ics.amber.operator.keywordSearch.KeywordSearchOpDesc +import edu.uci.ics.amber.operator.source.scan.csv.CSVScanSourceOpDesc +import edu.uci.ics.texera.dao.MockTexeraDB +import edu.uci.ics.texera.dao.jooq.generated.enums.UserRole +import edu.uci.ics.texera.dao.jooq.generated.tables.daos._ +import edu.uci.ics.texera.dao.jooq.generated.tables.pojos._ +import edu.uci.ics.texera.workflow.LogicalLink +import org.jooq.types.{UInteger, ULong} +import org.scalatest.flatspec.AnyFlatSpec +import org.scalatest.{BeforeAndAfterAll, BeforeAndAfterEach} + +import scala.jdk.CollectionConverters.CollectionHasAsScala + +class DefaultCostEstimatorSpec + extends AnyFlatSpec + with 
BeforeAndAfterAll + with BeforeAndAfterEach + with MockTexeraDB { + + private val headerlessCsvOpDesc: CSVScanSourceOpDesc = + TestOperators.headerlessSmallCsvScanOpDesc() + private val keywordOpDesc: KeywordSearchOpDesc = + TestOperators.keywordSearchOpDesc("column-1", "Asia") + private val groupByOpDesc: AggregateOpDesc = + TestOperators.aggregateAndGroupByDesc("column-1", AggregationFunction.COUNT, List[String]()) + + private val testUser: User = { + val user = new User + user.setUid(UInteger.valueOf(1)) + user.setName("test_user") + user.setRole(UserRole.ADMIN) + user.setPassword("123") + user.setEmail("test_user@test.com") + user + } + + private val testWorkflowEntry: Workflow = { + val workflow = new Workflow + workflow.setName("test workflow") + workflow.setWid(UInteger.valueOf(1)) + workflow.setContent("test workflow content") + workflow.setDescription("test description") + workflow + } + + private val testWorkflowVersionEntry: WorkflowVersion = { + val workflowVersion = new WorkflowVersion + workflowVersion.setWid(UInteger.valueOf(1)) + workflowVersion.setVid(UInteger.valueOf(1)) + workflowVersion.setContent("test version content") + workflowVersion + } + + private val testWorkflowExecutionEntry: WorkflowExecutions = { + val workflowExecution = new WorkflowExecutions + workflowExecution.setEid(UInteger.valueOf(1)) + workflowExecution.setVid(UInteger.valueOf(1)) + workflowExecution.setUid(UInteger.valueOf(1)) + workflowExecution.setStatus(3.toByte) + workflowExecution.setEnvironmentVersion("test engine") + workflowExecution + } + + private val headerlessCsvOpStatisticsEntry: WorkflowRuntimeStatistics = { + val workflowRuntimeStatistics = new WorkflowRuntimeStatistics + workflowRuntimeStatistics.setOperatorId(headerlessCsvOpDesc.operatorIdentifier.id) + workflowRuntimeStatistics.setWorkflowId(UInteger.valueOf(1)) + workflowRuntimeStatistics.setExecutionId(UInteger.valueOf(1)) + workflowRuntimeStatistics.setDataProcessingTime(ULong.valueOf(100)) + 
workflowRuntimeStatistics.setControlProcessingTime(ULong.valueOf(100)) + workflowRuntimeStatistics + } + + private val keywordOpDescStatisticsEntry: WorkflowRuntimeStatistics = { + val workflowRuntimeStatistics = new WorkflowRuntimeStatistics + workflowRuntimeStatistics.setOperatorId(keywordOpDesc.operatorIdentifier.id) + workflowRuntimeStatistics.setWorkflowId(UInteger.valueOf(1)) + workflowRuntimeStatistics.setExecutionId(UInteger.valueOf(1)) + workflowRuntimeStatistics.setDataProcessingTime(ULong.valueOf(300)) + workflowRuntimeStatistics.setControlProcessingTime(ULong.valueOf(300)) + workflowRuntimeStatistics + } + + private val groupByOpDescStatisticsEntry: WorkflowRuntimeStatistics = { + val workflowRuntimeStatistics = new WorkflowRuntimeStatistics + workflowRuntimeStatistics.setOperatorId(groupByOpDesc.operatorIdentifier.id) + workflowRuntimeStatistics.setWorkflowId(UInteger.valueOf(1)) + workflowRuntimeStatistics.setExecutionId(UInteger.valueOf(1)) + workflowRuntimeStatistics.setDataProcessingTime(ULong.valueOf(1000)) + workflowRuntimeStatistics.setControlProcessingTime(ULong.valueOf(1000)) + workflowRuntimeStatistics + } + + override protected def beforeEach(): Unit = { + initializeDBAndReplaceDSLContext() + } + + "DefaultCostEstimator" should "use fallback method when no past statistics are available" in { + val workflow = buildWorkflow( + List(headerlessCsvOpDesc, keywordOpDesc), + List( + LogicalLink( + headerlessCsvOpDesc.operatorIdentifier, + PortIdentity(0), + keywordOpDesc.operatorIdentifier, + PortIdentity(0) + ) + ), + new WorkflowContext() + ) + + val costEstimator = new DefaultCostEstimator( + workflow.context, + CONTROLLER + ) + + val region = Region( + id = RegionIdentity(0), + physicalOps = workflow.physicalPlan.operators, + physicalLinks = workflow.physicalPlan.links + ) + + val costOfRegion = costEstimator.estimate(region, 1) + + assert(costOfRegion == 0) + } + + "DefaultCostEstimator" should "use the latest successful execution to estimate 
cost when available" in {
+    val workflow = buildWorkflow(
+      List(headerlessCsvOpDesc, keywordOpDesc),
+      List(
+        LogicalLink(
+          headerlessCsvOpDesc.operatorIdentifier,
+          PortIdentity(0),
+          keywordOpDesc.operatorIdentifier,
+          PortIdentity(0)
+        )
+      ),
+      new WorkflowContext()
+    )
+
+    val userDao = new UserDao(getDSLContext.configuration())
+    val workflowDao = new WorkflowDao(getDSLContext.configuration())
+    val workflowExecutionsDao = new WorkflowExecutionsDao(getDSLContext.configuration())
+    val workflowVersionDao = new WorkflowVersionDao(getDSLContext.configuration())
+    val workflowRuntimeStatisticsDao =
+      new WorkflowRuntimeStatisticsDao(getDSLContext.configuration())
+
+    userDao.insert(testUser)
+    workflowDao.insert(testWorkflowEntry)
+    workflowVersionDao.insert(testWorkflowVersionEntry)
+    workflowExecutionsDao.insert(testWorkflowExecutionEntry)
+    workflowRuntimeStatisticsDao.insert(headerlessCsvOpStatisticsEntry)
+    workflowRuntimeStatisticsDao.insert(keywordOpDescStatisticsEntry)
+
+    val costEstimator = new DefaultCostEstimator(
+      workflow.context,
+      CONTROLLER
+    )
+
+    val region = Region(
+      id = RegionIdentity(0),
+      physicalOps = workflow.physicalPlan.operators,
+      physicalLinks = workflow.physicalPlan.links
+    )
+
+    val costOfRegion = costEstimator.estimate(region, 1)
+
+    assert(costOfRegion != 0)
+  }
+
+  "DefaultCostEstimator" should "correctly estimate costs in a search" in {
+    val workflow = buildWorkflow(
+      List(headerlessCsvOpDesc, groupByOpDesc, keywordOpDesc),
+      List(
+        LogicalLink(
+          headerlessCsvOpDesc.operatorIdentifier,
+          PortIdentity(0),
+          groupByOpDesc.operatorIdentifier,
+          PortIdentity(0)
+        ),
+        LogicalLink(
+          groupByOpDesc.operatorIdentifier,
+          PortIdentity(0),
+          keywordOpDesc.operatorIdentifier,
+          PortIdentity(0)
+        )
+      ),
+      new WorkflowContext()
+    )
+
+    val userDao = new UserDao(getDSLContext.configuration())
+    val workflowDao = new WorkflowDao(getDSLContext.configuration())
+    val workflowExecutionsDao = new WorkflowExecutionsDao(getDSLContext.configuration())
+    val workflowVersionDao = new WorkflowVersionDao(getDSLContext.configuration())
+    val workflowRuntimeStatisticsDao =
+      new WorkflowRuntimeStatisticsDao(getDSLContext.configuration())
+
+    userDao.insert(testUser)
+    workflowDao.insert(testWorkflowEntry)
+    workflowVersionDao.insert(testWorkflowVersionEntry)
+    workflowExecutionsDao.insert(testWorkflowExecutionEntry)
+    workflowRuntimeStatisticsDao.insert(headerlessCsvOpStatisticsEntry)
+    workflowRuntimeStatisticsDao.insert(groupByOpDescStatisticsEntry)
+    workflowRuntimeStatisticsDao.insert(keywordOpDescStatisticsEntry)
+
+    // Should contain two regions, one with CSV->localAgg->globalAgg, another with keyword->sink
+    val searchResult = new CostBasedScheduleGenerator(
+      workflow.context,
+      workflow.physicalPlan,
+      CONTROLLER
+    ).bottomUpSearch()
+
+    val groupByRegion =
+      searchResult.regionDAG.vertexSet().asScala.filter(region => region.physicalOps.size == 3).head
+    val keywordRegion =
+      searchResult.regionDAG.vertexSet().asScala.filter(region => region.physicalOps.size == 2).head
+
+    val costEstimator = new DefaultCostEstimator(
+      workflow.context,
+      CONTROLLER
+    )
+
+    val groupByRegionCost = costEstimator.estimate(groupByRegion, 1)
+
+    val groupByOperatorCost = (groupByOpDescStatisticsEntry.getDataProcessingTime
+      .doubleValue() + groupByOpDescStatisticsEntry.getControlProcessingTime.doubleValue()) / 1e9
+
+    // The cost of the first region should be the cost of the GroupBy operator (the two physical operators of the
+    // GroupBy logical operator have the same cost because statistics are recorded per logical operator).
+    // The GroupBy operator has a longer running time.
+    assert(groupByRegionCost == groupByOperatorCost)
+
+    val keywordRegionCost = costEstimator.estimate(keywordRegion, 1)
+
+    val keywordOperatorCost = (keywordOpDescStatisticsEntry.getDataProcessingTime
+      .doubleValue() + keywordOpDescStatisticsEntry.getControlProcessingTime.doubleValue()) / 1e9
+
+    // The cost of the second region should be the cost of the keyword operator, since the sink operator has the same
+    // logical operator as the keyword operator.
+    assert(keywordRegionCost == keywordOperatorCost)
+
+    // The cost of the region plan should be the sum of region costs
+    assert(searchResult.cost == groupByRegionCost + keywordRegionCost)
+  }
+
+  override protected def afterEach(): Unit = {
+    shutdownDB()
+  }
+
+}

From c2bef3abfd67cc7033ba2f6926aec3d4c0532b84 Mon Sep 17 00:00:00 2001
From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com>
Date: Wed, 1 Jan 2025 10:26:42 -0800
Subject: [PATCH 27/47] Simplify schema build (#3188)

To simplify schema creation, this PR removes the Schema.builder() pattern and
makes Schema immutable. All modifications now result in the creation of a new
Schema instance.
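The commit message above describes the builder-free, immutable Schema API. As a rough sketch of the idea (using simplified stand-in types, not the actual Texera `Attribute`/`Schema` classes), each `add` call returns a fresh `Schema` instead of mutating shared builder state:

```scala
// Hypothetical, simplified stand-ins for Texera's Attribute and Schema classes.
case class Attribute(name: String, attributeType: String)

// Immutable schema: add() returns a new Schema rather than mutating a builder.
case class Schema(attributes: List[Attribute] = List()) {
  def add(name: String, attributeType: String): Schema =
    Schema(attributes :+ Attribute(name, attributeType))

  def getAttributeNames: List[String] = attributes.map(_.name)
}

val schema = Schema()
  .add("field1", "INTEGER")
  .add("field2", "STRING")

// Attribute order is preserved, and the original instances are untouched.
assert(schema.getAttributeNames == List("field1", "field2"))
println(schema.getAttributeNames.mkString(", "))
```

Because every modification produces a new instance, schemas can be shared across threads or cached without defensive copies, which is the design motivation stated in the commit message.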
--- .../NetworkInputGatewaySpec.scala | 6 +- .../messaginglayer/OutputManagerSpec.scala | 4 +- .../RangeBasedShuffleSpec.scala | 6 +- .../architecture/worker/DPThreadSpec.scala | 6 +- .../worker/DataProcessorSpec.scala | 2 +- .../architecture/worker/WorkerSpec.scala | 6 +- .../engine/faulttolerance/LoggingSpec.scala | 4 +- .../uci/ics/amber/core/marker/Marker.scala | 11 +- .../uci/ics/amber/core/tuple/Attribute.java | 2 +- .../amber/core/tuple/AttributeTypeUtils.scala | 27 +- .../edu/uci/ics/amber/core/tuple/Schema.scala | 253 ++++++++--------- .../uci/ics/amber/core/tuple/TupleUtils.scala | 12 +- .../edu/uci/ics/amber/util/ArrowUtils.scala | 12 +- .../uci/ics/amber/core/tuple/SchemaSpec.scala | 258 ++++++++++++++++++ .../uci/ics/amber/core/tuple/TupleSpec.scala | 25 +- .../operator/SpecialPhysicalOpFactory.scala | 6 +- .../operator/aggregate/AggregateOpDesc.scala | 9 +- .../CartesianProductOpDesc.scala | 9 +- .../dictionary/DictionaryMatcherOpDesc.scala | 10 +- .../operator/hashJoin/HashJoinOpDesc.scala | 53 ++-- ...gingFaceIrisLogisticRegressionOpDesc.scala | 5 +- .../HuggingFaceSentimentAnalysisOpDesc.scala | 5 +- .../HuggingFaceSpamSMSDetectionOpDesc.scala | 5 +- .../HuggingFaceTextSummarizationOpDesc.scala | 5 +- .../intervalJoin/IntervalJoinOpDesc.scala | 20 +- .../Scorer/MachineLearningScorerOpDesc.scala | 19 +- .../base/SklearnAdvancedBaseDesc.scala | 11 +- .../projection/ProjectionOpDesc.scala | 30 +- .../sentiment/SentimentAnalysisOpDesc.scala | 14 +- .../sklearn/SklearnClassifierOpDesc.scala | 4 +- .../SklearnLinearRegressionOpDesc.scala | 4 +- .../sklearn/SklearnPredictionOpDesc.scala | 5 +- .../reddit/RedditSearchSourceOpDesc.scala | 42 ++- ...TwitterFullArchiveSearchSourceOpDesc.scala | 75 +++-- .../v2/TwitterSearchSourceOpDesc.scala | 74 +++-- .../source/fetcher/URLFetcherOpDesc.scala | 10 +- .../source/scan/FileScanSourceOpDesc.scala | 10 +- .../source/scan/csv/CSVScanSourceOpDesc.scala | 9 +- .../csv/ParallelCSVScanSourceOpDesc.scala | 16 +- 
.../scan/csvOld/CSVOldScanSourceOpDesc.scala | 16 +- .../scan/json/JSONLScanSourceOpDesc.scala | 10 +- .../scan/text/TextInputSourceOpDesc.scala | 7 +- .../operator/source/sql/SQLSourceOpDesc.scala | 47 ++-- .../sql/asterixdb/AsterixDBSourceOpDesc.scala | 15 +- .../operator/udf/java/JavaUDFOpDesc.scala | 17 +- .../DualInputPortsPythonUDFOpDescV2.scala | 20 +- .../python/PythonLambdaFunctionOpDesc.scala | 26 +- .../udf/python/PythonTableReducerOpDesc.scala | 8 +- .../udf/python/PythonUDFOpDescV2.scala | 23 +- .../source/PythonUDFSourceOpDescV2.scala | 8 +- .../ics/amber/operator/udf/r/RUDFOpDesc.scala | 22 +- .../operator/udf/r/RUDFSourceOpDesc.scala | 8 +- .../unneststring/UnnestStringOpDesc.scala | 14 +- .../visualization/DotPlot/DotPlotOpDesc.scala | 9 +- .../IcicleChart/IcicleChartOpDesc.scala | 9 +- .../ImageViz/ImageVisualizerOpDesc.scala | 9 +- .../ScatterMatrixChartOpDesc.scala | 9 +- .../barChart/BarChartOpDesc.scala | 9 +- .../visualization/boxPlot/BoxPlotOpDesc.scala | 9 +- .../bubbleChart/BubbleChartOpDesc.scala | 9 +- .../CandlestickChartOpDesc.scala | 9 +- .../ContinuousErrorBandsOpDesc.scala | 9 +- .../contourPlot/ContourPlotOpDesc.scala | 9 +- .../dumbbellPlot/DumbbellPlotOpDesc.scala | 9 +- .../FigureFactoryTableOpDesc.scala | 9 +- .../filledAreaPlot/FilledAreaPlotOpDesc.scala | 9 +- .../funnelPlot/FunnelPlotOpDesc.scala | 9 +- .../ganttChart/GanttChartOpDesc.scala | 9 +- .../visualization/heatMap/HeatMapOpDesc.scala | 9 +- .../hierarchychart/HierarchyChartOpDesc.scala | 9 +- .../histogram/HistogramChartOpDesc.scala | 9 +- .../visualization/htmlviz/HtmlVizOpDesc.scala | 7 +- .../lineChart/LineChartOpDesc.scala | 9 +- .../pieChart/PieChartOpDesc.scala | 9 +- .../quiverPlot/QuiverPlotOpDesc.scala | 9 +- .../sankeyDiagram/SankeyDiagramOpDesc.scala | 9 +- .../scatter3DChart/Scatter3dChartOpDesc.scala | 9 +- .../scatterplot/ScatterplotOpDesc.scala | 9 +- .../tablesChart/TablesPlotOpDesc.scala | 9 +- .../ternaryPlot/TernaryPlotOpDesc.scala | 9 +- 
.../visualization/urlviz/UrlVizOpDesc.scala | 7 +- .../waterfallChart/WaterfallChartOpDesc.scala | 9 +- .../wordCloud/WordCloudOpDesc.scala | 13 +- .../CartesianProductOpExecSpec.scala | 12 +- .../DictionaryMatcherOpExecSpec.scala | 4 +- .../difference/DifferenceOpExecSpec.scala | 12 +- .../distinct/DistinctOpExecSpec.scala | 4 +- .../filter/SpecializedFilterOpExecSpec.scala | 6 +- .../operator/hashJoin/HashJoinOpSpec.scala | 16 +- .../intersect/IntersectOpExecSpec.scala | 8 +- .../intervalJoin/IntervalOpExecSpec.scala | 11 +- .../KeywordSearchOpExecSpec.scala | 4 +- .../projection/ProjectionOpExecSpec.scala | 16 +- .../SortPartitionsOpExecSpec.scala | 4 +- .../SymmetricDifferenceOpExecSpec.scala | 12 +- .../typecasting/TypeCastingOpExecSpec.scala | 9 +- .../unneststring/UnnestStringOpExecSpec.scala | 4 +- .../uci/ics/amber/util/ArrowUtilsSpec.scala | 4 +- 98 files changed, 892 insertions(+), 829 deletions(-) create mode 100644 core/workflow-core/src/test/scala/edu/uci/ics/amber/core/tuple/SchemaSpec.scala diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGatewaySpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGatewaySpec.scala index 2203ff7f2a9..c05fae417e4 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGatewaySpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/NetworkInputGatewaySpec.scala @@ -10,13 +10,11 @@ class NetworkInputGatewaySpec extends AnyFlatSpec with MockFactory { private val fakeReceiverID = ActorVirtualIdentity("testReceiver") private val fakeSenderID = ActorVirtualIdentity("testSender") - private val channelId = ChannelIdentity(fakeSenderID, fakeReceiverID, false) + private val channelId = ChannelIdentity(fakeSenderID, fakeReceiverID, isControl = false) private val payloads = (0 until 4).map { i => DataFrame( Array( - TupleLike(i) 
enforceSchema ( - Schema.builder().add("field1", AttributeType.INTEGER).build() - ) + TupleLike(i) enforceSchema Schema().add("field1", AttributeType.INTEGER) ) ) }.toArray diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManagerSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManagerSpec.scala index 916f6cf3e81..4fb4e45ce8f 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManagerSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/OutputManagerSpec.scala @@ -22,15 +22,13 @@ class OutputManagerSpec extends AnyFlatSpec with MockFactory { private val mockDataOutputPort = // scalafix:ok; need it for wiring purpose new NetworkOutputGateway(identifier, mockHandler) var counter: Int = 0 - val schema: Schema = Schema - .builder() + val schema: Schema = Schema() .add("field1", AttributeType.INTEGER) .add("field2", AttributeType.INTEGER) .add("field3", AttributeType.INTEGER) .add("field4", AttributeType.INTEGER) .add("field5", AttributeType.STRING) .add("field6", AttributeType.DOUBLE) - .build() def physicalOpId(): PhysicalOpIdentity = { counter += 1 diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/RangeBasedShuffleSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/RangeBasedShuffleSpec.scala index 59034227074..3c0bfbf6e8c 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/RangeBasedShuffleSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/messaginglayer/RangeBasedShuffleSpec.scala @@ -16,7 +16,7 @@ class RangeBasedShuffleSpec extends AnyFlatSpec with MockFactory { val fakeID5: ActorVirtualIdentity = ActorVirtualIdentity("rec5") val attr: Attribute = new Attribute("Attr1", AttributeType.INTEGER) - val schema: Schema = 
Schema.builder().add(attr).build() + val schema: Schema = Schema().add(attr) val partitioning: RangeBasedShufflePartitioning = RangeBasedShufflePartitioning( 400, @@ -82,7 +82,7 @@ class RangeBasedShuffleSpec extends AnyFlatSpec with MockFactory { val partitioner2: RangeBasedShufflePartitioner = RangeBasedShufflePartitioner(partitioning2) val doubleAttr: Attribute = new Attribute("Attr2", AttributeType.DOUBLE) - val doubleSchema: Schema = Schema.builder().add(doubleAttr).build() + val doubleSchema: Schema = Schema().add(doubleAttr) tuple = Tuple.builder(doubleSchema).add(doubleAttr, -90.5).build() idx = partitioner2.getBucketIndex(tuple) assert(idx.next() == 1) @@ -104,7 +104,7 @@ class RangeBasedShuffleSpec extends AnyFlatSpec with MockFactory { val partitioner3: RangeBasedShufflePartitioner = RangeBasedShufflePartitioner(partitioning3) val longAttr: Attribute = new Attribute("Attr3", AttributeType.LONG) - val longSchema: Schema = Schema.builder().add(longAttr).build() + val longSchema: Schema = Schema().add(longAttr) tuple = Tuple.builder(longSchema).add(longAttr, -90L).build() idx = partitioner3.getBucketIndex(tuple) assert(idx.next() == 1) diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DPThreadSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DPThreadSpec.scala index 8c8cedefcb7..ef33ecc4aaf 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DPThreadSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DPThreadSpec.scala @@ -35,7 +35,7 @@ class DPThreadSpec extends AnyFlatSpec with MockFactory { private val executor = mock[OperatorExecutor] private val mockInputPortId = PortIdentity() - private val schema: Schema = Schema.builder().add("field1", AttributeType.INTEGER).build() + private val schema: Schema = Schema().add("field1", AttributeType.INTEGER) private val tuples: Array[Tuple] = (0 until 5000) .map(i => 
TupleLike(i).enforceSchema(schema)) .toArray @@ -167,7 +167,7 @@ class DPThreadSpec extends AnyFlatSpec with MockFactory { } "DP Thread" should "write determinant logs to local storage while processing" in { - val dp = new DataProcessor(workerId, x => {}) + val dp = new DataProcessor(workerId, _ => {}) dp.executor = executor val inputQueue = new LinkedBlockingQueue[DPInputQueueElement]() val anotherSenderWorkerId = ActorVirtualIdentity("another") @@ -183,7 +183,7 @@ class DPThreadSpec extends AnyFlatSpec with MockFactory { ) logStorage.deleteStorage() val logManager: ReplayLogManager = - ReplayLogManager.createLogManager(logStorage, "tmpLog", x => {}) + ReplayLogManager.createLogManager(logStorage, "tmpLog", _ => {}) val dpThread = new DPThread(workerId, dp, logManager, inputQueue) dpThread.start() tuples.foreach { x => diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorSpec.scala index a3b62cfabca..2ef61da21e6 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/DataProcessorSpec.scala @@ -46,7 +46,7 @@ class DataProcessorSpec extends AnyFlatSpec with MockFactory with BeforeAndAfter private val outputPortId = PortIdentity() private val outputHandler = mock[Either[MainThreadDelegateMessage, WorkflowFIFOMessage] => Unit] private val adaptiveBatchingMonitor = mock[WorkerTimerService] - private val schema: Schema = Schema.builder().add("field1", AttributeType.INTEGER).build() + private val schema: Schema = Schema().add("field1", AttributeType.INTEGER) private val tuples: Array[Tuple] = (0 until 400) .map(i => TupleLike(i).enforceSchema(schema)) .toArray diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala 
b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala index 4fc14016ac1..85ac8aa0fbd 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/architecture/worker/WorkerSpec.scala @@ -48,11 +48,11 @@ class WorkerSpec with MockFactory { def mkSchema(fields: Any*): Schema = { - val schemaBuilder = Schema.builder() + var schema = Schema() fields.indices.foreach { i => - schemaBuilder.add(new Attribute("field" + i, AttributeType.ANY)) + schema = schema.add(new Attribute("field" + i, AttributeType.ANY)) } - schemaBuilder.build() + schema } def mkTuple(fields: Any*): Tuple = { diff --git a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/LoggingSpec.scala b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/LoggingSpec.scala index fade388c4ff..a35a1d41d66 100644 --- a/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/LoggingSpec.scala +++ b/core/amber/src/test/scala/edu/uci/ics/amber/engine/faulttolerance/LoggingSpec.scala @@ -80,12 +80,10 @@ class LoggingSpec (0 to 400) .map(i => TupleLike(i, i.toString, i.toDouble).enforceSchema( - Schema - .builder() + Schema() .add("field1", AttributeType.INTEGER) .add("field2", AttributeType.STRING) .add("field3", AttributeType.DOUBLE) - .build() ) ) .toArray diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/marker/Marker.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/marker/Marker.scala index 3c201bf51d0..300816c9da4 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/marker/Marker.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/marker/Marker.scala @@ -31,13 +31,10 @@ final case class State(tuple: Option[Tuple] = None, passToAllDownstream: Boolean def toTuple: Tuple = Tuple .builder( - Schema - .builder() - .add(data.map { - case (name, (attrType, _)) => - new Attribute(name, 
attrType) - }) - .build() + Schema(data.map { + case (name, (attrType, _)) => + new Attribute(name, attrType) + }.toList) ) .addSequentially(data.values.map(_._2).toArray) .build() diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/Attribute.java b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/Attribute.java index 9c7661d514e..643a61d7b12 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/Attribute.java +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/Attribute.java @@ -38,7 +38,7 @@ public AttributeType getType() { @Override public String toString() { - return "edu.ics.uci.amber.model.tuple.model.Attribute[name=" + attributeName + ", type=" + attributeType + "]"; + return "Attribute[name=" + attributeName + ", type=" + attributeType + "]"; } @Override diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/AttributeTypeUtils.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/AttributeTypeUtils.scala index 1e333b8d7de..0a08b2883ee 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/AttributeTypeUtils.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/AttributeTypeUtils.scala @@ -11,39 +11,36 @@ import scala.util.control.Exception.allCatch object AttributeTypeUtils extends Serializable { /** - * this loop check whether the current attribute in the array is the attribute for casting, - * if it is, change it to result type - * if it's not, remain the same type - * we need this loop to keep the order the same as the original + * This function checks whether the current attribute in the schema matches the selected attribute for casting. + * If it matches, its type is changed to the specified result type. + * If it doesn't match, the original type is retained. + * The order of attributes in the schema is preserved. 
+ * * @param schema schema of data * @param attribute selected attribute * @param resultType casting type - * @return schema of data + * @return a new schema with the modified attribute type */ def SchemaCasting( schema: Schema, attribute: String, resultType: AttributeType ): Schema = { - // need a builder to maintain the order of original schema - val builder = Schema.builder() - val attributes: List[Attribute] = schema.getAttributes - // change the schema when meet selected attribute else remain the same - for (i <- attributes.indices) { - if (attributes.apply(i).getName.equals(attribute)) { + val updatedAttributes = schema.getAttributes.map { attr => + if (attr.getName == attribute) { resultType match { case AttributeType.STRING | AttributeType.INTEGER | AttributeType.DOUBLE | AttributeType.LONG | AttributeType.BOOLEAN | AttributeType.TIMESTAMP | AttributeType.BINARY => - builder.add(attribute, resultType) + new Attribute(attribute, resultType) // Cast to the specified result type case AttributeType.ANY | _ => - builder.add(attribute, attributes.apply(i).getType) + attr // Retain the original type for unsupported types } } else { - builder.add(attributes.apply(i).getName, attributes.apply(i).getType) + attr // Retain attributes that don't match the target } } - builder.build() + Schema(updatedAttributes) } /** diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/Schema.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/Schema.scala index b85ac6dd82f..15f608fb808 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/Schema.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/Schema.scala @@ -4,27 +4,39 @@ import com.fasterxml.jackson.annotation.{JsonCreator, JsonIgnore, JsonProperty} import com.google.common.base.Preconditions.checkNotNull import scala.collection.immutable.ListMap -import scala.collection.mutable +/** + * Represents the schema of a tuple, consisting of a list 
of attributes. + * The schema is immutable, and any modifications result in a new Schema instance. + */ case class Schema @JsonCreator() ( - @JsonProperty(value = "attributes", required = true) attributes: List[Attribute] + @JsonProperty(value = "attributes", required = true) attributes: List[Attribute] = List() ) extends Serializable { checkNotNull(attributes) - val attributeIndex: Map[String, Int] = + // Maps attribute names (case-insensitive) to their indices in the schema. + private val attributeIndex: Map[String, Int] = attributes.view.map(_.getName.toLowerCase).zipWithIndex.toMap - def this(attrs: Attribute*) = { - this(attrs.toList) - } + def this(attrs: Attribute*) = this(attrs.toList) + /** + * Returns the list of attributes in the schema. + */ @JsonProperty(value = "attributes") def getAttributes: List[Attribute] = attributes + /** + * Returns a list of all attribute names in the schema. + */ @JsonIgnore def getAttributeNames: List[String] = attributes.map(_.getName) + /** + * Returns the index of a specified attribute by name. + * Throws an exception if the attribute is not found. + */ def getIndex(attributeName: String): Int = { if (!containsAttribute(attributeName)) { throw new RuntimeException(s"$attributeName is not contained in the schema") @@ -32,8 +44,14 @@ case class Schema @JsonCreator() ( attributeIndex(attributeName.toLowerCase) } + /** + * Retrieves an attribute by its name. + */ def getAttribute(attributeName: String): Attribute = attributes(getIndex(attributeName)) + /** + * Checks whether the schema contains an attribute with the specified name. 
+ */ @JsonIgnore def containsAttribute(attributeName: String): Boolean = attributeIndex.contains(attributeName.toLowerCase) @@ -46,165 +64,122 @@ case class Schema @JsonCreator() ( result } - override def equals(obj: Any): Boolean = + override def equals(obj: Any): Boolean = { obj match { - case that: Schema => - this.attributes == that.attributes && this.attributeIndex == that.attributeIndex - case _ => false + case that: Schema => this.attributes == that.attributes + case _ => false } + } - override def toString: String = s"Schema[$attributes]" + override def toString: String = s"Schema[${attributes.map(_.toString).mkString(", ")}]" + /** + * Creates a new Schema containing only the specified attributes. + */ def getPartialSchema(attributeNames: List[String]): Schema = { Schema(attributeNames.map(name => getAttribute(name))) } /** - * This method converts to a Schema into a raw format, where each pair of attribute name and attribute type - * are represented as string. This is for serialization between languages. + * Converts the schema into a raw format where each attribute name + * and attribute type are represented as strings. Useful for serialization across languages. 
*/ def toRawSchema: Map[String, String] = - getAttributes.foldLeft(ListMap[String, String]())((list, attr) => + attributes.foldLeft(ListMap[String, String]())((list, attr) => list + (attr.getName -> attr.getType.name()) ) -} - -object Schema { - def fromRawSchema(raw: Map[String, String]): Schema = { - Schema(raw.map { - case (name, attrType) => - new Attribute(name, AttributeType.valueOf(attrType)) - }.toList) - } - - def builder(): Builder = Builder() - - case class Builder(private var attributes: List[Attribute] = List.empty) { - private val attributeNames: mutable.Set[String] = mutable.Set.empty - - def add(attribute: Attribute): Builder = { - require(attribute != null, "edu.ics.uci.amber.model.tuple.model.Attribute cannot be null") - checkAttributeNotExists(attribute.getName) - attributes ::= attribute - attributeNames += attribute.getName.toLowerCase - this - } - - def add(attributeName: String, attributeType: AttributeType): Builder = { - add(new Attribute(attributeName, attributeType)) - this - } - - def add(attributes: Iterable[Attribute]): Builder = { - attributes.foreach(add) - this + /** + * Creates a new Schema by adding multiple attributes to the current schema. + * Throws an exception if any attribute name already exists in the schema. + */ + def add(attributesToAdd: Iterable[Attribute]): Schema = { + val existingNames = this.getAttributeNames.map(_.toLowerCase).toSet + val duplicateNames = attributesToAdd.map(_.getName.toLowerCase).toSet.intersect(existingNames) + + if (duplicateNames.nonEmpty) { + throw new RuntimeException( + s"Cannot add attributes with duplicate names: ${duplicateNames.mkString(", ")}" + ) } - def add(attributes: Attribute*): Builder = { - attributes.foreach(add) - this - } + val newAttributes = attributes ++ attributesToAdd + Schema(newAttributes) + } - def add(schema: Schema): Builder = { - checkNotNull(schema) - add(schema.getAttributes) - this - } + /** + * Creates a new Schema by adding multiple attributes. 
+ * Accepts a variable number of `Attribute` arguments. + * Throws an exception if any attribute name already exists in the schema. + */ + def add(attributes: Attribute*): Schema = { + this.add(attributes) + } - def build(): Schema = Schema(attributes.reverse) - - /** - * Removes an attribute from the schema builder if it exists. - * - * @param attribute , the name of the attribute - * @return this Builder object - */ - def removeIfExists(attribute: String): Builder = { - checkNotNull(attribute) - attributes = attributes.filter((attr: Attribute) => !attr.getName.equalsIgnoreCase(attribute)) - attributeNames.remove(attribute.toLowerCase) - this + /** + * Creates a new Schema by adding a single attribute to the current schema. + * Throws an exception if the attribute name already exists in the schema. + */ + def add(attribute: Attribute): Schema = { + if (containsAttribute(attribute.getName)) { + throw new RuntimeException( + s"Attribute name '${attribute.getName}' already exists in the schema" + ) } + add(List(attribute)) + } - /** - * Removes the attributes from the schema builder if they exist. - * - * @param attributes , the names of the attributes - * @return this Builder object - */ - def removeIfExists(attributes: Iterable[String]): Builder = { - checkNotNull(attributes) - attributes.foreach((attr: String) => checkNotNull(attr)) - attributes.foreach((attr: String) => this.removeIfExists(attr)) - this - } + /** + * Creates a new Schema by adding an attribute with the specified name and type. + * Throws an exception if the attribute name already exists in the schema. + */ + def add(attributeName: String, attributeType: AttributeType): Schema = + add(new Attribute(attributeName, attributeType)) - /** - * Removes the attributes from the schema builder if they exist. 
- * - * @param attributes , the names of the attributes - * @return this Builder object - */ - def removeIfExists(attributes: String*): Builder = { - checkNotNull(attributes) - this.removeIfExists(attributes) - this - } + /** + * Creates a new Schema by merging it with another schema. + * Throws an exception if there are duplicate attribute names. + */ + def add(schema: Schema): Schema = { + add(schema.attributes) + } - /** - * Removes an attribute from the schema builder. - * Fails if the attribute does not exist. - * - * @param attribute , the name of the attribute - * @return this Builder object - */ - def remove(attribute: String): Builder = { - checkNotNull(attribute) - checkAttributeExists(attribute) - removeIfExists(attribute) - this + /** + * Creates a new Schema by removing attributes with the specified names. + * Throws an exception if any of the specified attributes do not exist in the schema. + */ + def remove(attributeNames: Iterable[String]): Schema = { + val attributesToRemove = attributeNames.map(_.toLowerCase).toSet + + // Check for non-existent attributes + val nonExistentAttributes = attributesToRemove.diff(attributes.map(_.getName.toLowerCase).toSet) + if (nonExistentAttributes.nonEmpty) { + throw new IllegalArgumentException( + s"Cannot remove non-existent attributes: ${nonExistentAttributes.mkString(", ")}" + ) } - /** - * Removes the attributes from the schema builder. - * Fails if an attributes does not exist. - */ - def remove(attributes: Iterable[String]): Builder = { - checkNotNull(attributes) - attributes.foreach(attrName => checkNotNull(attrName)) - attributes.foreach(this.checkAttributeExists) - this.removeIfExists(attributes) - this - } + val remainingAttributes = + attributes.filterNot(attr => attributesToRemove.contains(attr.getName.toLowerCase)) + Schema(remainingAttributes) + } - /** - * Removes the attributes from the schema builder. - * Fails if an attributes does not exist. 
- * - * @param attributes - * @return the builder itself - */ - def remove(attributes: String*): Builder = { - checkNotNull(attributes) - this.remove(attributes) - this - } + /** + * Creates a new Schema by removing a single attribute with the specified name. + */ + def remove(attributeName: String): Schema = remove(List(attributeName)) +} - private def checkAttributeNotExists(attributeName: String): Unit = { - if (attributeNames.contains(attributeName.toLowerCase)) { - throw new RuntimeException( - s"edu.ics.uci.amber.model.tuple.model.Attribute $attributeName already exists in the schema" - ) - } - } +object Schema { - private def checkAttributeExists(attributeName: String): Unit = { - if (!attributeNames.contains(attributeName.toLowerCase)) { - throw new RuntimeException( - s"edu.ics.uci.amber.model.tuple.model.Attribute $attributeName does not exist in the schema" - ) - } - } + /** + * Creates a Schema instance from a raw map representation. + * Each entry in the map contains an attribute name and its type as strings. 
+ */ + def fromRawSchema(raw: Map[String, String]): Schema = { + Schema(raw.map { + case (name, attrType) => + new Attribute(name, AttributeType.valueOf(attrType)) + }.toList) } } diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/TupleUtils.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/TupleUtils.scala index c8735782748..9323f10a5ac 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/TupleUtils.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/core/tuple/TupleUtils.scala @@ -42,13 +42,11 @@ object TupleUtils { result.toArray })) - val schema = Schema - .builder() - .add( - sortedFieldNames.indices - .map(i => new Attribute(sortedFieldNames(i), attributeTypes(i))) - ) - .build() + val schema = Schema( + sortedFieldNames.indices + .map(i => new Attribute(sortedFieldNames(i), attributeTypes(i))) + .toList + ) try { val fields = scala.collection.mutable.ArrayBuffer.empty[Any] diff --git a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/ArrowUtils.scala b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/ArrowUtils.scala index ae205511d5b..19ac52b501d 100644 --- a/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/ArrowUtils.scala +++ b/core/workflow-core/src/main/scala/edu/uci/ics/amber/util/ArrowUtils.scala @@ -78,13 +78,11 @@ object ArrowUtils extends LazyLogging { * @return A Texera Schema. */ def toTexeraSchema(arrowSchema: org.apache.arrow.vector.types.pojo.Schema): Schema = - Schema - .builder() - .add( - arrowSchema.getFields.asScala - .map(field => new Attribute(field.getName, toAttributeType(field.getType))) - ) - .build() + Schema( + arrowSchema.getFields.asScala.map { field => + new Attribute(field.getName, toAttributeType(field.getType)) + }.toList + ) /** * Converts an ArrowType into an AttributeType. 
diff --git a/core/workflow-core/src/test/scala/edu/uci/ics/amber/core/tuple/SchemaSpec.scala b/core/workflow-core/src/test/scala/edu/uci/ics/amber/core/tuple/SchemaSpec.scala new file mode 100644 index 00000000000..827a0d3264a --- /dev/null +++ b/core/workflow-core/src/test/scala/edu/uci/ics/amber/core/tuple/SchemaSpec.scala @@ -0,0 +1,258 @@ +package edu.uci.ics.amber.core.tuple + +import org.scalatest.flatspec.AnyFlatSpec + +class SchemaSpec extends AnyFlatSpec { + + "Schema" should "create an empty schema" in { + val schema = Schema() + assert(schema.getAttributes.isEmpty) + assert(schema.getAttributeNames.isEmpty) + } + + it should "create a schema with attributes of all types" in { + val schema = Schema( + List( + new Attribute("stringAttr", AttributeType.STRING), + new Attribute("integerAttr", AttributeType.INTEGER), + new Attribute("longAttr", AttributeType.LONG), + new Attribute("doubleAttr", AttributeType.DOUBLE), + new Attribute("booleanAttr", AttributeType.BOOLEAN), + new Attribute("timestampAttr", AttributeType.TIMESTAMP), + new Attribute("binaryAttr", AttributeType.BINARY) + ) + ) + assert( + schema.getAttributes == List( + new Attribute("stringAttr", AttributeType.STRING), + new Attribute("integerAttr", AttributeType.INTEGER), + new Attribute("longAttr", AttributeType.LONG), + new Attribute("doubleAttr", AttributeType.DOUBLE), + new Attribute("booleanAttr", AttributeType.BOOLEAN), + new Attribute("timestampAttr", AttributeType.TIMESTAMP), + new Attribute("binaryAttr", AttributeType.BINARY) + ) + ) + assert( + schema.getAttributeNames == List( + "stringAttr", + "integerAttr", + "longAttr", + "doubleAttr", + "booleanAttr", + "timestampAttr", + "binaryAttr" + ) + ) + } + + it should "add a single attribute using add(Attribute)" in { + val schema = Schema() + val updatedSchema = schema.add(new Attribute("id", AttributeType.INTEGER)) + + assert(updatedSchema.getAttributes == List(new Attribute("id", AttributeType.INTEGER))) + } + + it should "add multiple 
attributes using add(Attribute*)" in { + val schema = Schema() + val updatedSchema = schema.add( + new Attribute("stringAttr", AttributeType.STRING), + new Attribute("integerAttr", AttributeType.INTEGER), + new Attribute("longAttr", AttributeType.LONG) + ) + + assert( + updatedSchema.getAttributes == List( + new Attribute("stringAttr", AttributeType.STRING), + new Attribute("integerAttr", AttributeType.INTEGER), + new Attribute("longAttr", AttributeType.LONG) + ) + ) + } + + it should "add attributes from another schema using add(Schema)" in { + val schema1 = Schema(List(new Attribute("id", AttributeType.INTEGER))) + val schema2 = Schema(List(new Attribute("name", AttributeType.STRING))) + + val mergedSchema = schema1.add(schema2) + + assert( + mergedSchema.getAttributes == List( + new Attribute("id", AttributeType.INTEGER), + new Attribute("name", AttributeType.STRING) + ) + ) + } + + it should "add an attribute with name and type using add(String, AttributeType)" in { + val schema = Schema() + val updatedSchema = schema.add("id", AttributeType.INTEGER) + + assert(updatedSchema.getAttributes == List(new Attribute("id", AttributeType.INTEGER))) + } + + it should "remove an existing attribute" in { + val schema = Schema( + List( + new Attribute("id", AttributeType.INTEGER), + new Attribute("name", AttributeType.STRING) + ) + ) + + val updatedSchema = schema.remove("id") + + assert(updatedSchema.getAttributes == List(new Attribute("name", AttributeType.STRING))) + } + + it should "throw an exception when removing a non-existent attribute" in { + val schema = Schema( + List(new Attribute("id", AttributeType.INTEGER)) + ) + + val exception = intercept[IllegalArgumentException] { + schema.remove("name") + } + assert(exception.getMessage == "Cannot remove non-existent attributes: name") + } + + it should "retrieve an attribute by name" in { + val schema = Schema( + List( + new Attribute("id", AttributeType.INTEGER), + new Attribute("name", AttributeType.STRING) + ) + ) + 
+ val attribute = schema.getAttribute("id") + + assert(attribute == new Attribute("id", AttributeType.INTEGER)) + } + + it should "throw an exception when retrieving a non-existent attribute" in { + val schema = Schema(List(new Attribute("id", AttributeType.INTEGER))) + + val exception = intercept[RuntimeException] { + schema.getAttribute("name") + } + assert(exception.getMessage == "name is not contained in the schema") + } + + it should "return a partial schema for attributes of all types" in { + val schema = Schema( + List( + new Attribute("stringAttr", AttributeType.STRING), + new Attribute("integerAttr", AttributeType.INTEGER), + new Attribute("booleanAttr", AttributeType.BOOLEAN), + new Attribute("doubleAttr", AttributeType.DOUBLE) + ) + ) + + val partialSchema = schema.getPartialSchema(List("stringAttr", "booleanAttr")) + + assert( + partialSchema.getAttributes == List( + new Attribute("stringAttr", AttributeType.STRING), + new Attribute("booleanAttr", AttributeType.BOOLEAN) + ) + ) + } + + it should "convert to raw schema and back for attributes of all types" in { + val schema = Schema( + List( + new Attribute("stringAttr", AttributeType.STRING), + new Attribute("integerAttr", AttributeType.INTEGER), + new Attribute("longAttr", AttributeType.LONG), + new Attribute("doubleAttr", AttributeType.DOUBLE), + new Attribute("booleanAttr", AttributeType.BOOLEAN), + new Attribute("timestampAttr", AttributeType.TIMESTAMP), + new Attribute("binaryAttr", AttributeType.BINARY) + ) + ) + + val rawSchema = schema.toRawSchema + assert( + rawSchema == Map( + "stringAttr" -> "STRING", + "integerAttr" -> "INTEGER", + "longAttr" -> "LONG", + "doubleAttr" -> "DOUBLE", + "booleanAttr" -> "BOOLEAN", + "timestampAttr" -> "TIMESTAMP", + "binaryAttr" -> "BINARY" + ) + ) + + val reconstructedSchema = Schema.fromRawSchema(rawSchema) + assert(reconstructedSchema == schema) + } + + it should "check if attributes exist in schema" in { + val schema = Schema( + List( + new 
Attribute("stringAttr", AttributeType.STRING), + new Attribute("integerAttr", AttributeType.INTEGER) + ) + ) + + assert(schema.containsAttribute("stringAttr")) + assert(!schema.containsAttribute("nonExistentAttr")) + } + + it should "return the index of an attribute by name" in { + val schema = Schema( + List( + new Attribute("id", AttributeType.INTEGER), + new Attribute("name", AttributeType.STRING) + ) + ) + + assert(schema.getIndex("id") == 0) + assert(schema.getIndex("name") == 1) + } + + it should "throw an exception when getting the index of a non-existent attribute" in { + val schema = Schema(List(new Attribute("id", AttributeType.INTEGER))) + + val exception = intercept[RuntimeException] { + schema.getIndex("name") + } + assert(exception.getMessage == "name is not contained in the schema") + } + + it should "compare schemas for equality" in { + val schema1 = Schema( + List( + new Attribute("id", AttributeType.INTEGER), + new Attribute("name", AttributeType.STRING) + ) + ) + val schema2 = Schema( + List( + new Attribute("id", AttributeType.INTEGER), + new Attribute("name", AttributeType.STRING) + ) + ) + val schema3 = Schema( + List( + new Attribute("id", AttributeType.INTEGER) + ) + ) + + assert(schema1 == schema2) + assert(schema1 != schema3) + } + + it should "return a proper string representation" in { + val schema = Schema( + List( + new Attribute("id", AttributeType.INTEGER), + new Attribute("name", AttributeType.STRING) + ) + ) + + assert( + schema.toString == "Schema[Attribute[name=id, type=integer], Attribute[name=name, type=string]]" + ) + } +} diff --git a/core/workflow-core/src/test/scala/edu/uci/ics/amber/core/tuple/TupleSpec.scala b/core/workflow-core/src/test/scala/edu/uci/ics/amber/core/tuple/TupleSpec.scala index f5941c22c85..b5a3897df75 100644 --- a/core/workflow-core/src/test/scala/edu/uci/ics/amber/core/tuple/TupleSpec.scala +++ b/core/workflow-core/src/test/scala/edu/uci/ics/amber/core/tuple/TupleSpec.scala @@ -18,20 +18,20 @@ class 
TupleSpec extends AnyFlatSpec { it should "create a tuple with capitalized attributeName" in { - val schema = Schema.builder().add(capitalizedStringAttribute).build() + val schema = Schema().add(capitalizedStringAttribute) val tuple = Tuple.builder(schema).add(capitalizedStringAttribute, "string-value").build() assert(tuple.getField("COL-string").asInstanceOf[String] == "string-value") } it should "create a tuple with capitalized attributeName, using addSequentially" in { - val schema = Schema.builder().add(capitalizedStringAttribute).build() + val schema = Schema().add(capitalizedStringAttribute) val tuple = Tuple.builder(schema).addSequentially(Array("string-value")).build() assert(tuple.getField("COL-string").asInstanceOf[String] == "string-value") } it should "create a tuple using new builder, based on another tuple using old builder" in { - val schema = Schema.builder().add(stringAttribute).build() + val schema = Schema().add(stringAttribute) val inputTuple = Tuple.builder(schema).addSequentially(Array("string-value")).build() val newTuple = Tuple.builder(inputTuple.getSchema).add(inputTuple).build() @@ -39,22 +39,21 @@ class TupleSpec extends AnyFlatSpec { } it should "fail when unknown attribute is added to tuple" in { - val schema = Schema.builder().add(stringAttribute).build() + val schema = Schema().add(stringAttribute) assertThrows[TupleBuildingException] { Tuple.builder(schema).add(integerAttribute, 1) } } it should "fail when tuple does not conform to complete schema" in { - val schema = Schema.builder().add(stringAttribute).add(integerAttribute).build() + val schema = Schema().add(stringAttribute).add(integerAttribute) assertThrows[TupleBuildingException] { Tuple.builder(schema).add(integerAttribute, 1).build() } } it should "fail when entire tuple passed in has extra attributes" in { - val inputSchema = - Schema.builder().add(stringAttribute).add(integerAttribute).add(boolAttribute).build() + val inputSchema = 
Schema().add(stringAttribute).add(integerAttribute).add(boolAttribute) val inputTuple = Tuple .builder(inputSchema) .add(integerAttribute, 1) @@ -62,7 +61,7 @@ class TupleSpec extends AnyFlatSpec { .add(boolAttribute, true) .build() - val outputSchema = Schema.builder().add(stringAttribute).add(integerAttribute).build() + val outputSchema = Schema().add(stringAttribute).add(integerAttribute) assertThrows[TupleBuildingException] { Tuple.builder(outputSchema).add(inputTuple).build() } @@ -70,7 +69,7 @@ class TupleSpec extends AnyFlatSpec { it should "not fail when entire tuple passed in has extra attributes and strictSchemaMatch is false" in { val inputSchema = - Schema.builder().add(stringAttribute).add(integerAttribute).add(boolAttribute).build() + Schema().add(stringAttribute).add(integerAttribute).add(boolAttribute) val inputTuple = Tuple .builder(inputSchema) .add(integerAttribute, 1) @@ -78,7 +77,7 @@ class TupleSpec extends AnyFlatSpec { .add(boolAttribute, true) .build() - val outputSchema = Schema.builder().add(stringAttribute).add(integerAttribute).build() + val outputSchema = Schema().add(stringAttribute).add(integerAttribute) val outputTuple = Tuple.builder(outputSchema).add(inputTuple, false).build() // This is the important test. 
Input tuple has 3 attributes but output tuple has only 2 @@ -88,7 +87,7 @@ class TupleSpec extends AnyFlatSpec { it should "produce identical strings" in { val inputSchema = - Schema.builder().add(stringAttribute).add(integerAttribute).add(boolAttribute).build() + Schema().add(stringAttribute).add(integerAttribute).add(boolAttribute) val inputTuple = Tuple .builder(inputSchema) .add(integerAttribute, 1) @@ -104,8 +103,7 @@ class TupleSpec extends AnyFlatSpec { it should "calculate hash" in { val inputSchema = - Schema - .builder() + Schema() .add(integerAttribute) .add(stringAttribute) .add(boolAttribute) @@ -113,7 +111,6 @@ class TupleSpec extends AnyFlatSpec { .add(doubleAttribute) .add(timestampAttribute) .add(binaryAttribute) - .build() val inputTuple = Tuple .builder(inputSchema) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala index e60040eb467..c7fd35a93bc 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/SpecialPhysicalOpFactory.scala @@ -44,11 +44,7 @@ object SpecialPhysicalOpFactory { case SET_SNAPSHOT | SINGLE_SNAPSHOT => if (inputSchema.containsAttribute(ProgressiveUtils.insertRetractFlagAttr.getName)) { // with insert/retract delta: remove the flag column - Schema - .builder() - .add(inputSchema) - .remove(ProgressiveUtils.insertRetractFlagAttr.getName) - .build() + inputSchema.remove(ProgressiveUtils.insertRetractFlagAttr.getName) } else { // with insert-only delta: output schema is the same as input schema inputSchema diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala index 0ea2557f4ef..80ad4892782 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/aggregate/AggregateOpDesc.scala @@ -54,15 +54,12 @@ class AggregateOpDesc extends LogicalOp { .withPropagateSchema( SchemaPropagationFunc(inputSchemas => { val inputSchema = inputSchemas(operatorInfo.inputPorts.head.id) - val outputSchema = Schema - .builder() - .add(groupByKeys.map(key => inputSchema.getAttribute(key)): _*) - .add( + val outputSchema = Schema( + groupByKeys.map(key => inputSchema.getAttribute(key)) ++ localAggregations.map(agg => agg.getAggregationAttribute(inputSchema.getAttribute(agg.attribute).getType) ) - ) - .build() + ) Map(PortIdentity(internal = true) -> outputSchema) }) ) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala index c17a94e3a40..ca6db486b43 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpDesc.scala @@ -39,12 +39,12 @@ class CartesianProductOpDesc extends LogicalOp { // In this example, the last attribute from the right schema (`dup`) is renamed to `dup#@3` // to avoid conflicts. 
-    val builder = Schema.builder()
+    var outputSchema = Schema()
     val leftSchema = inputSchemas(operatorInfo.inputPorts.head.id)
     val rightSchema = inputSchemas(operatorInfo.inputPorts.last.id)
     val leftAttributeNames = leftSchema.getAttributeNames
     val rightAttributeNames = rightSchema.getAttributeNames
-    builder.add(leftSchema)
+    outputSchema = outputSchema.add(leftSchema)
     rightSchema.getAttributes.foreach(attr => {
       var newName = attr.getName
       while (
@@ -56,13 +56,12 @@ class CartesianProductOpDesc extends LogicalOp {
       }
       if (newName == attr.getName) {
         // non-duplicate attribute, add to builder as is
-        builder.add(attr)
+        outputSchema = outputSchema.add(attr)
       } else {
         // renamed the duplicate attribute, construct new Attribute
-        builder.add(new Attribute(newName, attr.getType))
+        outputSchema = outputSchema.add(new Attribute(newName, attr.getType))
       }
     })
-    val outputSchema = builder.build()
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   })
 )
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala
index 2a82b03d10b..3b5a60f2220 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.dictionary
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType}
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.map.MapOpDesc
@@ -49,11 +49,8 @@ class DictionaryMatcherOpDesc extends MapOpDesc {
       SchemaPropagationFunc(inputSchemas => {
         if (resultAttribute == null || resultAttribute.trim.isEmpty) return null
         Map(
-          operatorInfo.outputPorts.head.id -> Schema
-            .builder()
-            .add(inputSchemas.values.head)
-            .add(resultAttribute, AttributeType.BOOLEAN)
-            .build()
+          operatorInfo.outputPorts.head.id -> inputSchemas.values.head
+            .add(new Attribute(resultAttribute, AttributeType.BOOLEAN))
         )
       })
     )
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala
index 756f468f46d..9b8429e4e18 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpDesc.scala
@@ -4,14 +4,13 @@ import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
-import edu.uci.ics.amber.core.workflow._
-import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.core.virtualidentity.{
   ExecutionIdentity,
   PhysicalOpIdentity,
   WorkflowIdentity
 }
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalLink, PortIdentity}
+import edu.uci.ics.amber.core.workflow._
+import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.operator.hashJoin.HashJoinOpDesc.HASH_JOIN_INTERNAL_KEY_NAME
 import edu.uci.ics.amber.operator.metadata.annotations.{
   AutofillAttributeName,
@@ -79,11 +78,9 @@ class HashJoinOpDesc[K] extends LogicalOp {
     .withPropagateSchema(
       SchemaPropagationFunc(inputSchemas =>
         Map(
-          
PortIdentity(internal = true) -> Schema - .builder() - .add(HASH_JOIN_INTERNAL_KEY_NAME, AttributeType.ANY) - .add(inputSchemas(operatorInfo.inputPorts.head.id)) - .build() + PortIdentity(internal = true) -> Schema( + List(new Attribute(HASH_JOIN_INTERNAL_KEY_NAME, AttributeType.ANY)) + ).add(inputSchemas(operatorInfo.inputPorts.head.id)) ) ) ) @@ -121,27 +118,29 @@ class HashJoinOpDesc[K] extends LogicalOp { SchemaPropagationFunc(inputSchemas => { val buildSchema = inputSchemas(PortIdentity(internal = true)) val probeSchema = inputSchemas(PortIdentity(1)) - val builder = Schema.builder() - builder.add(buildSchema) - builder.removeIfExists(HASH_JOIN_INTERNAL_KEY_NAME) - val leftAttributeNames = buildSchema.getAttributeNames - val rightAttributeNames = - probeSchema.getAttributeNames.filterNot(name => name == probeAttributeName) - - // Create a Map from rightTuple's fields, renaming conflicts - rightAttributeNames - .foreach { name => - var newName = name - while ( - leftAttributeNames.contains(newName) || rightAttributeNames - .filter(attrName => name != attrName) - .contains(newName) - ) { - newName = s"$newName#@1" + + // Start with the attributes from the build schema, excluding the hash join internal key + val leftAttributes = + buildSchema.getAttributes.filterNot(_.getName == HASH_JOIN_INTERNAL_KEY_NAME) + val leftAttributeNames = leftAttributes.map(_.getName).toSet + + // Filter and rename attributes from the probe schema to avoid conflicts + val rightAttributes = probeSchema.getAttributes + .filterNot(_.getName == probeAttributeName) + .map { attr => + var newName = attr.getName + while (leftAttributeNames.contains(newName)) { + val suffixIndex = """#@(\d+)$""".r + .findFirstMatchIn(newName) + .map(_.group(1).toInt + 1) + .getOrElse(1) + newName = s"${attr.getName}#@$suffixIndex" } - builder.add(new Attribute(newName, probeSchema.getAttribute(name).getType)) + new Attribute(newName, attr.getType) } - val outputSchema = builder.build() + + // Combine left and 
right attributes into a new schema + val outputSchema = Schema(leftAttributes ++ rightAttributes) Map(PortIdentity() -> outputSchema) }) ) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala index dcef5abf438..283f2168b10 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceIrisLogisticRegressionOpDesc.scala @@ -99,12 +99,9 @@ class HuggingFaceIrisLogisticRegressionOpDesc extends PythonOperatorDescriptor { ) throw new RuntimeException("Result attribute name should not be empty") Map( - operatorInfo.outputPorts.head.id -> Schema - .builder() - .add(inputSchemas(operatorInfo.inputPorts.head.id)) + operatorInfo.outputPorts.head.id -> inputSchemas(operatorInfo.inputPorts.head.id) .add(predictionClassName, AttributeType.STRING) .add(predictionProbabilityName, AttributeType.DOUBLE) - .build() ) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala index 5e9027951a9..875bc5d1b61 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSentimentAnalysisOpDesc.scala @@ -87,13 +87,10 @@ class HuggingFaceSentimentAnalysisOpDesc extends PythonOperatorDescriptor { ) return null Map( - operatorInfo.outputPorts.head.id -> Schema - .builder() - .add(inputSchemas(operatorInfo.inputPorts.head.id)) + operatorInfo.outputPorts.head.id -> 
inputSchemas(operatorInfo.inputPorts.head.id) .add(resultAttributePositive, AttributeType.DOUBLE) .add(resultAttributeNeutral, AttributeType.DOUBLE) .add(resultAttributeNegative, AttributeType.DOUBLE) - .build() ) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala index 4257c17a6d5..d12dceffc40 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceSpamSMSDetectionOpDesc.scala @@ -58,12 +58,9 @@ class HuggingFaceSpamSMSDetectionOpDesc extends PythonOperatorDescriptor { inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { Map( - operatorInfo.outputPorts.head.id -> Schema - .builder() - .add(inputSchemas.values.head) + operatorInfo.outputPorts.head.id -> inputSchemas.values.head .add(resultAttributeSpam, AttributeType.BOOLEAN) .add(resultAttributeProbability, AttributeType.DOUBLE) - .build() ) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala index e79369fb959..86b3059d36d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/huggingFace/HuggingFaceTextSummarizationOpDesc.scala @@ -63,11 +63,8 @@ class HuggingFaceTextSummarizationOpDesc extends PythonOperatorDescriptor { if (resultAttribute == null || resultAttribute.trim.isEmpty) throw new RuntimeException("Result attribute name should be given") Map( - operatorInfo.outputPorts.head.id 
-> Schema - .builder() - .add(inputSchemas.values.head) + operatorInfo.outputPorts.head.id -> inputSchemas.values.head .add(resultAttribute, AttributeType.STRING) - .build() ) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala index 764a42b2708..dd61c510a1a 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalJoinOpDesc.scala @@ -92,19 +92,21 @@ class IntervalJoinOpDesc extends LogicalOp { .withOutputPorts(operatorInfo.outputPorts) .withPropagateSchema( SchemaPropagationFunc(inputSchemas => { - val builder: Schema.Builder = Schema.builder() val leftTableSchema: Schema = inputSchemas(operatorInfo.inputPorts.head.id) val rightTableSchema: Schema = inputSchemas(operatorInfo.inputPorts.last.id) - builder.add(leftTableSchema) - rightTableSchema.getAttributes - .map(attr => { - if (leftTableSchema.containsAttribute(attr.getName)) { - builder.add(new Attribute(s"${attr.getName}#@1", attr.getType)) + + // Start with the left table schema + val outputSchema = rightTableSchema.getAttributes.foldLeft(leftTableSchema) { + (currentSchema, attr) => + if (currentSchema.containsAttribute(attr.getName)) { + // Add the attribute with a suffix to avoid conflicts + currentSchema.add(new Attribute(s"${attr.getName}#@1", attr.getType)) } else { - builder.add(attr.getName, attr.getType) + // Add the attribute as is + currentSchema.add(attr) } - }) - val outputSchema = builder.build() + } + Map(operatorInfo.outputPorts.head.id -> outputSchema) }) ) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala index 62ca41b34eb..a73ceb3ae14 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/Scorer/MachineLearningScorerOpDesc.scala @@ -67,21 +67,22 @@ class MachineLearningScorerOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchemaBuilder = Schema.builder() - if (!isRegression) { - outputSchemaBuilder.add(new Attribute("Class", AttributeType.STRING)) - } - val metrics = if (isRegression) { regressionMetrics.map(_.getName()) } else { classificationMetrics.map(_.getName()) } - metrics.foreach(metricName => { - outputSchemaBuilder.add(new Attribute(metricName, AttributeType.DOUBLE)) - }) + val baseSchema = if (!isRegression) { + Schema(List(new Attribute("Class", AttributeType.STRING))) + } else { + Schema(List()) + } + + val outputSchema = metrics.foldLeft(baseSchema) { (currentSchema, metricName) => + currentSchema.add(new Attribute(metricName, AttributeType.DOUBLE)) + } - Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build()) + Map(operatorInfo.outputPorts.head.id -> outputSchema) } // private def getClassificationScorerName(scorer: classificationMetricsFnc): String = { diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala index 0d35b6cbc85..5e035fd1b5c 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala +++ 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/machineLearning/sklearnAdvanced/base/SklearnAdvancedBaseDesc.scala @@ -152,10 +152,13 @@ abstract class SklearnMLOperatorDescriptor[T <: ParamClass] extends PythonOperat override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchemaBuilder = Schema.builder() - outputSchemaBuilder.add(new Attribute("Model", AttributeType.BINARY)) - outputSchemaBuilder.add(new Attribute("Parameters", AttributeType.STRING)) + val outputSchema = Schema( + List( + new Attribute("Model", AttributeType.BINARY), + new Attribute("Parameters", AttributeType.STRING) + ) + ) - Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build()) + Map(operatorInfo.outputPorts.head.id -> outputSchema) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala index 39183a07ea5..120b5996910 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/projection/ProjectionOpDesc.scala @@ -1,17 +1,15 @@ package edu.uci.ics.amber.operator.projection import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} -import com.google.common.base.Preconditions import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.tuple.{Attribute, Schema} +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.PhysicalOp.oneToOnePhysicalOp import edu.uci.ics.amber.core.workflow._ import edu.uci.ics.amber.operator.map.MapOpDesc import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import 
edu.uci.ics.amber.util.JSONUtils.objectMapper -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} class ProjectionOpDesc extends MapOpDesc { @@ -39,27 +37,21 @@ class ProjectionOpDesc extends MapOpDesc { .withOutputPorts(operatorInfo.outputPorts) .withDerivePartition(derivePartition()) .withPropagateSchema(SchemaPropagationFunc(inputSchemas => { - Preconditions.checkArgument(attributes.nonEmpty) + require(attributes.nonEmpty, "Attributes must not be empty") + val inputSchema = inputSchemas.values.head val outputSchema = if (!isDrop) { - Schema - .builder() - .add(attributes.map { attribute => - val originalType = inputSchema.getAttribute(attribute.getOriginalAttribute).getType - new Attribute(attribute.getAlias, originalType) - }) - .build() + attributes.foldLeft(Schema()) { (schema, attribute) => + val originalType = inputSchema.getAttribute(attribute.getOriginalAttribute).getType + schema.add(attribute.getAlias, originalType) + } } else { - val outputSchemaBuilder = Schema.builder() - outputSchemaBuilder.add(inputSchema) - for (attribute <- attributes) { - outputSchemaBuilder.removeIfExists(attribute.getOriginalAttribute) + attributes.foldLeft(inputSchema) { (schema, attribute) => + schema.remove(attribute.getOriginalAttribute) } - outputSchemaBuilder.build() } - Map( - operatorInfo.outputPorts.head.id -> outputSchema - ) + + Map(operatorInfo.outputPorts.head.id -> outputSchema) })) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala index 155380851ba..0ef7545be8b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sentiment/SentimentAnalysisOpDesc.scala @@ 
-3,14 +3,13 @@ package edu.uci.ics.amber.operator.sentiment import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaInject import edu.uci.ics.amber.core.executor.OpExecWithClassName -import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} -import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} +import edu.uci.ics.amber.core.tuple.AttributeType +import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} +import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.map.MapOpDesc -import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName +import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.util.JSONUtils.objectMapper -import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} -import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort} @JsonSchemaInject(json = """ { @@ -59,11 +58,8 @@ class SentimentAnalysisOpDesc extends MapOpDesc { if (resultAttribute == null || resultAttribute.trim.isEmpty) return null Map( - operatorInfo.outputPorts.head.id -> Schema - .builder() - .add(inputSchemas.values.head) + operatorInfo.outputPorts.head.id -> inputSchemas.values.head .add(resultAttribute, AttributeType.INTEGER) - .build() ) }) ) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala index 2279f4126dd..190f271eb6e 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala +++ 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnClassifierOpDesc.scala @@ -110,11 +110,9 @@ abstract class SklearnClassifierOpDesc extends PythonOperatorDescriptor { inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { Map( - operatorInfo.outputPorts.head.id -> Schema - .builder() + operatorInfo.outputPorts.head.id -> Schema() .add("model_name", AttributeType.STRING) .add("model", AttributeType.BINARY) - .build() ) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala index 35e0e7d4d9d..430b9208e3b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnLinearRegressionOpDesc.scala @@ -63,11 +63,9 @@ class SklearnLinearRegressionOpDesc extends PythonOperatorDescriptor { inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { Map( - operatorInfo.outputPorts.head.id -> Schema - .builder() + operatorInfo.outputPorts.head.id -> Schema() .add("model_name", AttributeType.STRING) .add("model", AttributeType.BINARY) - .build() ) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala index 6e3c8ae5cd7..4653c1b6983 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/sklearn/SklearnPredictionOpDesc.scala @@ -68,11 +68,8 @@ class SklearnPredictionOpDesc extends PythonOperatorDescriptor { inputSchema.attributes.find(attr => attr.getName == groundTruthAttribute).get.getType } Map( - 
operatorInfo.outputPorts.head.id -> Schema - .builder() - .add(inputSchema) + operatorInfo.outputPorts.head.id -> inputSchema .add(resultAttribute, resultType) - .build() ) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala index 6213cc26ded..3ffe3d63594 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/reddit/RedditSearchSourceOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.source.apis.reddit import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.source.PythonSourceOperatorDescriptor import edu.uci.ics.amber.core.workflow.{OutputPort, PortIdentity} @@ -112,28 +112,24 @@ class RedditSearchSourceOpDesc extends PythonSourceOperatorDescriptor { override def asSource() = true override def sourceSchema(): Schema = - Schema - .builder() - .add( - new Attribute("id", AttributeType.STRING), - new Attribute("name", AttributeType.STRING), - new Attribute("title", AttributeType.STRING), - new Attribute("created_utc", AttributeType.TIMESTAMP), - new Attribute("edited", AttributeType.TIMESTAMP), - new Attribute("is_self", AttributeType.BOOLEAN), - new Attribute("selftext", AttributeType.STRING), - new Attribute("over_18", AttributeType.BOOLEAN), - new Attribute("is_original_content", AttributeType.BOOLEAN), - new Attribute("locked", AttributeType.BOOLEAN), - new 
Attribute("score", AttributeType.INTEGER), - new Attribute("upvote_ratio", AttributeType.DOUBLE), - new Attribute("num_comments", AttributeType.INTEGER), - new Attribute("permalink", AttributeType.STRING), - new Attribute("url", AttributeType.STRING), - new Attribute("author_name", AttributeType.STRING), - new Attribute("subreddit", AttributeType.STRING) - ) - .build() + Schema() + .add("id", AttributeType.STRING) + .add("name", AttributeType.STRING) + .add("title", AttributeType.STRING) + .add("created_utc", AttributeType.TIMESTAMP) + .add("edited", AttributeType.TIMESTAMP) + .add("is_self", AttributeType.BOOLEAN) + .add("selftext", AttributeType.STRING) + .add("over_18", AttributeType.BOOLEAN) + .add("is_original_content", AttributeType.BOOLEAN) + .add("locked", AttributeType.BOOLEAN) + .add("score", AttributeType.INTEGER) + .add("upvote_ratio", AttributeType.DOUBLE) + .add("num_comments", AttributeType.INTEGER) + .add("permalink", AttributeType.STRING) + .add("url", AttributeType.STRING) + .add("author_name", AttributeType.STRING) + .add("subreddit", AttributeType.STRING) def getOutputSchemas(inputSchemas: Map[PortIdentity, Schema]): Map[PortIdentity, Schema] = { Map(operatorInfo.outputPorts.head.id -> sourceSchema()) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala index c3a92cbcadd..d4050625b1b 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterFullArchiveSearchSourceOpDesc.scala @@ -7,7 +7,7 @@ import com.kjetland.jackson.jsonSchema.annotations.{ JsonSchemaTitle } import edu.uci.ics.amber.core.executor.OpExecWithClassName -import 
edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.annotations.UIWidget import edu.uci.ics.amber.operator.source.apis.twitter.TwitterSourceOpDesc @@ -65,44 +65,39 @@ class TwitterFullArchiveSearchSourceOpDesc extends TwitterSourceOpDesc { // twitter schema is hard coded for now. V2 API has changed many fields of the Tweet object. // we are also currently depending on redouane59/twittered client library to parse tweet fields. - - Schema - .builder() - .add( - new Attribute("id", AttributeType.STRING), - new Attribute("text", AttributeType.STRING), - new Attribute("created_at", AttributeType.TIMESTAMP), - new Attribute("lang", AttributeType.STRING), - new Attribute("tweet_type", AttributeType.STRING), - new Attribute("place_id", AttributeType.STRING), - new Attribute("place_coordinate", AttributeType.STRING), - new Attribute("in_reply_to_status_id", AttributeType.STRING), - new Attribute("in_reply_to_user_id", AttributeType.STRING), - new Attribute("like_count", AttributeType.LONG), - new Attribute("quote_count", AttributeType.LONG), - new Attribute("reply_count", AttributeType.LONG), - new Attribute("retweet_count", AttributeType.LONG), - new Attribute("hashtags", AttributeType.STRING), - new Attribute("symbols", AttributeType.STRING), - new Attribute("urls", AttributeType.STRING), - new Attribute("mentions", AttributeType.STRING), - new Attribute("user_id", AttributeType.STRING), - new Attribute("user_created_at", AttributeType.TIMESTAMP), - new Attribute("user_name", AttributeType.STRING), - new Attribute("user_display_name", AttributeType.STRING), - new Attribute("user_lang", AttributeType.STRING), - new Attribute("user_description", AttributeType.STRING), - new Attribute("user_followers_count", AttributeType.LONG), - new Attribute("user_following_count", 
AttributeType.LONG), - new Attribute("user_tweet_count", AttributeType.LONG), - new Attribute("user_listed_count", AttributeType.LONG), - new Attribute("user_location", AttributeType.STRING), - new Attribute("user_url", AttributeType.STRING), - new Attribute("user_profile_image_url", AttributeType.STRING), - new Attribute("user_pinned_tweet_id", AttributeType.STRING), - new Attribute("user_protected", AttributeType.BOOLEAN), - new Attribute("user_verified", AttributeType.BOOLEAN) - ) - .build() + Schema() + .add("id", AttributeType.STRING) + .add("text", AttributeType.STRING) + .add("created_at", AttributeType.TIMESTAMP) + .add("lang", AttributeType.STRING) + .add("tweet_type", AttributeType.STRING) + .add("place_id", AttributeType.STRING) + .add("place_coordinate", AttributeType.STRING) + .add("in_reply_to_status_id", AttributeType.STRING) + .add("in_reply_to_user_id", AttributeType.STRING) + .add("like_count", AttributeType.LONG) + .add("quote_count", AttributeType.LONG) + .add("reply_count", AttributeType.LONG) + .add("retweet_count", AttributeType.LONG) + .add("hashtags", AttributeType.STRING) + .add("symbols", AttributeType.STRING) + .add("urls", AttributeType.STRING) + .add("mentions", AttributeType.STRING) + .add("user_id", AttributeType.STRING) + .add("user_created_at", AttributeType.TIMESTAMP) + .add("user_name", AttributeType.STRING) + .add("user_display_name", AttributeType.STRING) + .add("user_lang", AttributeType.STRING) + .add("user_description", AttributeType.STRING) + .add("user_followers_count", AttributeType.LONG) + .add("user_following_count", AttributeType.LONG) + .add("user_tweet_count", AttributeType.LONG) + .add("user_listed_count", AttributeType.LONG) + .add("user_location", AttributeType.STRING) + .add("user_url", AttributeType.STRING) + .add("user_profile_image_url", AttributeType.STRING) + .add("user_pinned_tweet_id", AttributeType.STRING) + .add("user_protected", AttributeType.BOOLEAN) + .add("user_verified", AttributeType.BOOLEAN) } } 
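The same refactoring pattern repeats across every file in this patch: the mutable `Schema.builder()` / `.build()` pair is replaced by an immutable `Schema` whose `add` and `remove` calls each return a new instance, with `foldLeft` taking over where the old code looped over a mutable builder. A minimal, self-contained sketch of that pattern follows — note the `Attribute` and `Schema` classes below are simplified stand-ins for illustration, not Texera's actual API:

```scala
// Hypothetical, simplified stand-ins for Texera's Attribute/Schema classes.
case class Attribute(name: String, attrType: String)

case class Schema(attributes: List[Attribute] = List()) {
  // Every operation returns a NEW Schema; the receiver is never mutated.
  def add(name: String, attrType: String): Schema =
    Schema(attributes :+ Attribute(name, attrType))
  def add(attr: Attribute): Schema =
    Schema(attributes :+ attr)
  def remove(name: String): Schema =
    Schema(attributes.filterNot(_.name == name))
  def containsAttribute(name: String): Boolean =
    attributes.exists(_.name == name)
}

// Before: Schema.builder().add(...).add(...).build()
// After: fluent chaining directly on the immutable Schema.
val tweetSchema = Schema()
  .add("id", "STRING")
  .add("text", "STRING")

// A mutable builder loop becomes a foldLeft that threads the
// accumulated schema through each step.
val metrics = List("accuracy", "precision", "recall")
val scorerSchema = metrics.foldLeft(Schema()) { (schema, m) =>
  schema.add(m, "DOUBLE")
}
```

Threading the schema through `foldLeft` keeps every intermediate value immutable, which is why hunks such as the `MachineLearningScorerOpDesc`, `IntervalJoinOpDesc`, and `ProjectionOpDesc` ones replace builder loops with folds rather than reassigning a shared builder.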
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala index 15b0ddfaf21..20960c3181d 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/apis/twitter/v2/TwitterSearchSourceOpDesc.scala @@ -7,7 +7,7 @@ import com.kjetland.jackson.jsonSchema.annotations.{ JsonSchemaTitle } import edu.uci.ics.amber.core.executor.OpExecWithClassName -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.annotations.UIWidget import edu.uci.ics.amber.operator.source.apis.twitter.TwitterSourceOpDesc @@ -56,43 +56,39 @@ class TwitterSearchSourceOpDesc extends TwitterSourceOpDesc { // twitter schema is hard coded for now. V2 API has changed many fields of the Tweet object. // we are also currently depending on redouane59/twittered client library to parse tweet fields. 
- Schema - .builder() - .add( - new Attribute("id", AttributeType.STRING), - new Attribute("text", AttributeType.STRING), - new Attribute("created_at", AttributeType.TIMESTAMP), - new Attribute("lang", AttributeType.STRING), - new Attribute("tweet_type", AttributeType.STRING), - new Attribute("place_id", AttributeType.STRING), - new Attribute("place_coordinate", AttributeType.STRING), - new Attribute("in_reply_to_status_id", AttributeType.STRING), - new Attribute("in_reply_to_user_id", AttributeType.STRING), - new Attribute("like_count", AttributeType.LONG), - new Attribute("quote_count", AttributeType.LONG), - new Attribute("reply_count", AttributeType.LONG), - new Attribute("retweet_count", AttributeType.LONG), - new Attribute("hashtags", AttributeType.STRING), - new Attribute("symbols", AttributeType.STRING), - new Attribute("urls", AttributeType.STRING), - new Attribute("mentions", AttributeType.STRING), - new Attribute("user_id", AttributeType.STRING), - new Attribute("user_created_at", AttributeType.TIMESTAMP), - new Attribute("user_name", AttributeType.STRING), - new Attribute("user_display_name", AttributeType.STRING), - new Attribute("user_lang", AttributeType.STRING), - new Attribute("user_description", AttributeType.STRING), - new Attribute("user_followers_count", AttributeType.LONG), - new Attribute("user_following_count", AttributeType.LONG), - new Attribute("user_tweet_count", AttributeType.LONG), - new Attribute("user_listed_count", AttributeType.LONG), - new Attribute("user_location", AttributeType.STRING), - new Attribute("user_url", AttributeType.STRING), - new Attribute("user_profile_image_url", AttributeType.STRING), - new Attribute("user_pinned_tweet_id", AttributeType.STRING), - new Attribute("user_protected", AttributeType.BOOLEAN), - new Attribute("user_verified", AttributeType.BOOLEAN) - ) - .build() + Schema() + .add("id", AttributeType.STRING) + .add("text", AttributeType.STRING) + .add("created_at", AttributeType.TIMESTAMP) + 
.add("lang", AttributeType.STRING) + .add("tweet_type", AttributeType.STRING) + .add("place_id", AttributeType.STRING) + .add("place_coordinate", AttributeType.STRING) + .add("in_reply_to_status_id", AttributeType.STRING) + .add("in_reply_to_user_id", AttributeType.STRING) + .add("like_count", AttributeType.LONG) + .add("quote_count", AttributeType.LONG) + .add("reply_count", AttributeType.LONG) + .add("retweet_count", AttributeType.LONG) + .add("hashtags", AttributeType.STRING) + .add("symbols", AttributeType.STRING) + .add("urls", AttributeType.STRING) + .add("mentions", AttributeType.STRING) + .add("user_id", AttributeType.STRING) + .add("user_created_at", AttributeType.TIMESTAMP) + .add("user_name", AttributeType.STRING) + .add("user_display_name", AttributeType.STRING) + .add("user_lang", AttributeType.STRING) + .add("user_description", AttributeType.STRING) + .add("user_followers_count", AttributeType.LONG) + .add("user_following_count", AttributeType.LONG) + .add("user_tweet_count", AttributeType.LONG) + .add("user_listed_count", AttributeType.LONG) + .add("user_location", AttributeType.STRING) + .add("user_url", AttributeType.STRING) + .add("user_profile_image_url", AttributeType.STRING) + .add("user_pinned_tweet_id", AttributeType.STRING) + .add("user_protected", AttributeType.BOOLEAN) + .add("user_verified", AttributeType.BOOLEAN) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala index 49f5028d718..054af768565 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/fetcher/URLFetcherOpDesc.scala @@ -28,17 +28,11 @@ class URLFetcherOpDesc extends SourceOperatorDescriptor { var decodingMethod: DecodingMethod = _ override def sourceSchema(): Schema = 
{ - Schema - .builder() + Schema() .add( "URL content", - if (decodingMethod == DecodingMethod.UTF_8) { - AttributeType.STRING - } else { - AttributeType.ANY - } + if (decodingMethod == DecodingMethod.UTF_8) AttributeType.STRING else AttributeType.ANY ) - .build() } override def getPhysicalOp( diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala index 90c65c87eb9..437d2f2bbe3 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/FileScanSourceOpDesc.scala @@ -7,7 +7,7 @@ import com.kjetland.jackson.jsonSchema.annotations.{ JsonSchemaTitle } import edu.uci.ics.amber.core.executor.OpExecWithClassName -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.metadata.annotations.HideAnnotation @@ -66,8 +66,10 @@ class FileScanSourceOpDesc extends ScanSourceOpDesc with TextSourceOpDesc { } override def sourceSchema(): Schema = { - val builder = Schema.builder() - if (outputFileName) builder.add(new Attribute("filename", AttributeType.STRING)) - builder.add(new Attribute(attributeName, attributeType.getType)).build() + var schema = Schema() + if (outputFileName) { + schema = schema.add("filename", AttributeType.STRING) + } + schema.add(attributeName, attributeType.getType) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala index cd2fdda4bdf..f689611c5f8 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/CSVScanSourceOpDesc.scala @@ -6,7 +6,7 @@ import com.univocity.parsers.csv.{CsvFormat, CsvParser, CsvParserSettings} import edu.uci.ics.amber.core.executor.OpExecWithClassName import edu.uci.ics.amber.core.storage.DocumentFactory import edu.uci.ics.amber.core.tuple.AttributeTypeUtils.inferSchemaFromRows -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.source.scan.ScanSourceOpDesc import edu.uci.ics.amber.util.JSONUtils.objectMapper @@ -94,10 +94,9 @@ class CSVScanSourceOpDesc extends ScanSourceOpDesc { if (hasHeader) parser.getContext.headers() else (1 to attributeTypeList.length).map(i => "column-" + i).toArray - Schema - .builder() - .add(header.indices.map(i => new Attribute(header(i), attributeTypeList(i)))) - .build() + header.indices.foldLeft(Schema()) { (schema, i) => + schema.add(header(i), attributeTypeList(i)) + } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala index 4d4202da703..c5bffbf2080 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csv/ParallelCSVScanSourceOpDesc.scala @@ -86,18 +86,12 @@ class ParallelCSVScanSourceOpDesc extends ScanSourceOpDesc { reader.close() 
 
     // build schema based on inferred AttributeTypes
-    Schema
-      .builder()
-      .add(
-        firstRow.indices
-          .map((i: Int) =>
-            new Attribute(
-              if (hasHeader) firstRow.apply(i) else "column-" + (i + 1),
-              attributeTypeList.apply(i)
-            )
-          )
+    Schema().add(firstRow.indices.map { i =>
+      new Attribute(
+        if (hasHeader) firstRow(i) else s"column-${i + 1}",
+        attributeTypeList(i)
       )
-      .build()
+    })
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala
index 9ea25e13147..f4a3c427fc7 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/csvOld/CSVOldScanSourceOpDesc.scala
@@ -84,18 +84,12 @@ class CSVOldScanSourceOpDesc extends ScanSourceOpDesc {
     reader.close()
 
     // build schema based on inferred AttributeTypes
-    Schema
-      .builder()
-      .add(
-        firstRow.indices
-          .map((i: Int) =>
-            new Attribute(
-              if (hasHeader) firstRow.apply(i) else "column-" + (i + 1),
-              attributeTypeList.apply(i)
-            )
-          )
+    Schema().add(firstRow.indices.map { i =>
+      new Attribute(
+        if (hasHeader) firstRow(i) else s"column-${i + 1}",
+        attributeTypeList(i)
       )
-      .build()
+    })
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala
index 9a9deee9bbc..f0d7eb0c789 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/json/JSONLScanSourceOpDesc.scala
@@ -99,13 +99,9 @@ class JSONLScanSourceOpDesc extends ScanSourceOpDesc {
       result.toArray
     }))
 
-    Schema
-      .builder()
-      .add(
-        sortedFieldNames.indices
-          .map(i => new Attribute(sortedFieldNames(i), attributeTypes(i)))
-      )
-      .build()
+    Schema().add(sortedFieldNames.indices.map { i =>
+      new Attribute(sortedFieldNames(i), attributeTypes(i))
+    })
   }
 }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala
index bdb59fff827..597424a068b 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/scan/text/TextInputSourceOpDesc.scala
@@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.source.scan.text
 import com.fasterxml.jackson.annotation.JsonProperty
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.{Attribute, Schema}
+import edu.uci.ics.amber.core.tuple.Schema
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.UIWidget
@@ -39,10 +39,7 @@ class TextInputSourceOpDesc extends SourceOperatorDescriptor with TextSourceOpDe
   )
 
   override def sourceSchema(): Schema =
-    Schema
-      .builder()
-      .add(new Attribute(attributeName, attributeType.getType))
-      .build()
+    Schema().add(attributeName, attributeType.getType)
 
   override def operatorInfo: OperatorInfo =
     OperatorInfo(
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpDesc.scala
index 77113ff4660..c688e8e8302 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/SQLSourceOpDesc.scala
@@ -127,52 +127,55 @@
     }
     updatePort()
 
-    val schemaBuilder = Schema.builder()
     try {
+      val attributes = scala.collection.mutable.ListBuffer[Attribute]()
       val connection = establishConn
       connection.setReadOnly(true)
       val databaseMetaData = connection.getMetaData
       val columns = databaseMetaData.getColumns(null, null, this.table, null)
-      while ({
-        columns.next
-      }) {
+      while (columns.next()) {
         val columnName = columns.getString("COLUMN_NAME")
         val datatype = columns.getInt("DATA_TYPE")
-        datatype match {
+
+        // Map JDBC data types to AttributeType
+        val attributeType = datatype match {
           case Types.TINYINT | // -6 Types.TINYINT
               Types.SMALLINT | // 5 Types.SMALLINT
               Types.INTEGER => // 4 Types.INTEGER
-            schemaBuilder.add(new Attribute(columnName, AttributeType.INTEGER))
+            AttributeType.INTEGER
           case Types.FLOAT | // 6 Types.FLOAT
              Types.REAL | // 7 Types.REAL
              Types.DOUBLE | // 8 Types.DOUBLE
              Types.NUMERIC => // 3 Types.NUMERIC
-            schemaBuilder.add(new Attribute(columnName, AttributeType.DOUBLE))
+            AttributeType.DOUBLE
          case Types.BIT | // -7 Types.BIT
              Types.BOOLEAN => // 16 Types.BOOLEAN
-            schemaBuilder.add(new Attribute(columnName, AttributeType.BOOLEAN))
-          case Types.BINARY => //-2 Types.BINARY
-            schemaBuilder.add(new Attribute(columnName, AttributeType.BINARY))
-          case Types.DATE | //91 Types.DATE
-              Types.TIME | //92 Types.TIME
-              Types.LONGVARCHAR | //-1 Types.LONGVARCHAR
-              Types.CHAR | //1 Types.CHAR
-              Types.VARCHAR | //12 Types.VARCHAR
-              Types.NULL | //0 Types.NULL
-              Types.OTHER => //1111 Types.OTHER
-            schemaBuilder.add(new Attribute(columnName, AttributeType.STRING))
-          case Types.BIGINT => //-5 Types.BIGINT
-            schemaBuilder.add(new Attribute(columnName, AttributeType.LONG))
+            AttributeType.BOOLEAN
+          case Types.BINARY => // -2 Types.BINARY
+            AttributeType.BINARY
+          case Types.DATE | // 91 Types.DATE
+              Types.TIME | // 92 Types.TIME
+              Types.LONGVARCHAR | // -1 Types.LONGVARCHAR
+              Types.CHAR | // 1 Types.CHAR
+              Types.VARCHAR | // 12 Types.VARCHAR
+              Types.NULL | // 0 Types.NULL
+              Types.OTHER => // 1111 Types.OTHER
+            AttributeType.STRING
+          case Types.BIGINT => // -5 Types.BIGINT
+            AttributeType.LONG
           case Types.TIMESTAMP => // 93 Types.TIMESTAMP
-            schemaBuilder.add(new Attribute(columnName, AttributeType.TIMESTAMP))
+            AttributeType.TIMESTAMP
           case _ =>
             throw new RuntimeException(
               this.getClass.getSimpleName + ": unknown data type: " + datatype
             )
         }
+
+        // Add the attribute to the list
+        attributes += new Attribute(columnName, attributeType)
       }
       connection.close()
-      schemaBuilder.build()
+      Schema(attributes.toList)
     } catch {
       case e @ (_: SQLException | _: ClassCastException) =>
        throw new RuntimeException(
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala
index 6f688ae8e68..ccccaa720a4 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/source/sql/asterixdb/AsterixDBSourceOpDesc.scala
@@ -127,23 +127,22 @@
     updatePort()
 
-    val sb: Schema.Builder = Schema.builder()
-
-    // query dataset's Datatype from Metadata.`Datatype`
+    // Query dataset's Datatype from Metadata.`Datatype`
     val datasetDataType = queryAsterixDB(
       host,
       port,
-      "SELECT DatatypeName FROM Metadata.`Dataset` ds where ds.`DatasetName`='" + table + "';",
+      s"SELECT DatatypeName FROM Metadata.`Dataset` ds where ds.`DatasetName`='$table';",
       format = "JSON"
     ).get.next().asInstanceOf[JSONObject].getString("DatatypeName")
 
-    // query field types from Metadata.`Datatype`
+    // Query field types from Metadata.`Datatype`
     val fields = fetchDataTypeFields(datasetDataType, "", host, port)
 
-    for (key <- fields.keys.toList.sorted) {
-      sb.add(new Attribute(key, attributeTypeFromAsterixDBType(fields(key))))
+    // Collect attributes by sorting field names and mapping them to Attribute instances
+    val attributes = fields.keys.toList.sorted.map { key =>
+      new Attribute(key, attributeTypeFromAsterixDBType(fields(key)))
     }
-    sb.build()
+    Schema(attributes)
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala
index fd38d176ae1..f5b3b529589 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/java/JavaUDFOpDesc.scala
@@ -72,21 +72,22 @@ val propagateSchema = (inputSchemas: Map[PortIdentity, Schema]) => {
       val inputSchema = inputSchemas(operatorInfo.inputPorts.head.id)
-      val outputSchemaBuilder = Schema.builder()
-      // keep the same schema from input
-      if (retainInputColumns) outputSchemaBuilder.add(inputSchema)
-      // for any javaUDFType, it can add custom output columns (attributes).
-      if (outputColumns != null) {
-        if (retainInputColumns) { // check if columns are duplicated
+      var outputSchema = if (retainInputColumns) inputSchema else Schema()
+      // For any javaUDFType, it can add custom output columns (attributes).
+      if (outputColumns != null) {
+        if (retainInputColumns) {
+          // Check if columns are duplicated
           for (column <- outputColumns) {
             if (inputSchema.containsAttribute(column.getName))
               throw new RuntimeException("Column name " + column.getName + " already exists!")
           }
         }
-        outputSchemaBuilder.add(outputColumns).build()
+        // Add custom output columns
+        outputSchema = outputSchema.add(outputColumns)
       }
-      Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build())
+
+      Map(operatorInfo.outputPorts.head.id -> outputSchema)
     }
 
     if (workers > 1)
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala
index a4af16c5415..57f85a663c5 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/DualInputPortsPythonUDFOpDescV2.scala
@@ -91,22 +91,24 @@ class DualInputPortsPythonUDFOpDescV2 extends LogicalOp {
       .withPropagateSchema(
         SchemaPropagationFunc(inputSchemas => {
           Preconditions.checkArgument(inputSchemas.size == 2)
+
           val inputSchema = inputSchemas(operatorInfo.inputPorts(1).id)
-          val outputSchemaBuilder = Schema.builder()
-          // keep the same schema from input
-          if (retainInputColumns) outputSchemaBuilder.add(inputSchema)
-          // for any pythonUDFType, it can add custom output columns (attributes).
-          if (outputColumns != null) {
-            if (retainInputColumns) { // check if columns are duplicated
+          var outputSchema = if (retainInputColumns) inputSchema else Schema()
+          // For any pythonUDFType, add custom output columns (attributes).
+          if (outputColumns != null) {
+            if (retainInputColumns) {
+              // Check if columns are duplicated
               for (column <- outputColumns) {
                 if (inputSchema.containsAttribute(column.getName))
-                  throw new RuntimeException("Column name " + column.getName + " already exists!")
+                  throw new RuntimeException(s"Column name ${column.getName} already exists!")
               }
             }
-            outputSchemaBuilder.add(outputColumns).build()
+            // Add custom output columns
+            outputSchema = outputSchema.add(outputColumns)
           }
-          Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build())
+
+          Map(operatorInfo.outputPorts.head.id -> outputSchema)
         })
       )
   }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala
index 056326c8093..ff49ee70ff2 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonLambdaFunctionOpDesc.scala
@@ -16,31 +16,33 @@ class PythonLambdaFunctionOpDesc extends PythonOperatorDescriptor {
   ): Map[PortIdentity, Schema] = {
     Preconditions.checkArgument(inputSchemas.size == 1)
     Preconditions.checkArgument(lambdaAttributeUnits.nonEmpty)
+
     val inputSchema = inputSchemas.values.head
-    val outputSchemaBuilder = Schema.builder()
-    // keep the same schema from input
-    outputSchemaBuilder.add(inputSchema)
-    // add new attributes
+    var outputSchema = inputSchema
+
+    // Add new attributes
     for (unit <- lambdaAttributeUnits) {
       if (unit.attributeName.equalsIgnoreCase("Add New Column")) {
-        if (inputSchema.containsAttribute(unit.newAttributeName)) {
+        if (outputSchema.containsAttribute(unit.newAttributeName)) {
           throw new RuntimeException(
-            "Column name " + unit.newAttributeName + " already exists!"
+            s"Column name ${unit.newAttributeName} already exists!"
           )
         }
-        if (unit.newAttributeName != null && unit.newAttributeName.nonEmpty)
-          outputSchemaBuilder.add(unit.newAttributeName, unit.attributeType)
+        if (unit.newAttributeName != null && unit.newAttributeName.nonEmpty) {
+          outputSchema = outputSchema.add(unit.newAttributeName, unit.attributeType)
+        }
       }
     }
-    var outputSchema = outputSchemaBuilder.build()
-    // type casting
+
+    // Type casting
     for (unit <- lambdaAttributeUnits) {
-      if (!unit.attributeName.equalsIgnoreCase("Add New Column"))
+      if (!unit.attributeName.equalsIgnoreCase("Add New Column")) {
         outputSchema =
           AttributeTypeUtils.SchemaCasting(outputSchema, unit.attributeName, unit.attributeType)
+      }
     }
-      Map(operatorInfo.outputPorts.head.id -> outputSchema)
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala
index 0f6d1988de5..2f636a42421 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonTableReducerOpDesc.scala
@@ -15,11 +15,11 @@ class PythonTableReducerOpDesc extends PythonOperatorDescriptor {
     inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
     Preconditions.checkArgument(lambdaAttributeUnits.nonEmpty)
-    val outputSchemaBuilder = Schema.builder()
-    for (unit <- lambdaAttributeUnits) {
-      outputSchemaBuilder.add(unit.attributeName, unit.attributeType)
+    val outputSchema = lambdaAttributeUnits.foldLeft(Schema()) { (schema, unit) =>
+      schema.add(unit.attributeName, unit.attributeType)
     }
-    Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build())
+
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
   override def operatorInfo: OperatorInfo =
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala
index 216284315cb..1c070636ebd 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala
@@ -73,23 +73,24 @@ class PythonUDFOpDescV2 extends LogicalOp {
     }
 
     val propagateSchema = (inputSchemas: Map[PortIdentity, Schema]) => {
-      // Preconditions.checkArgument(schemas.length == 1)
       val inputSchema = inputSchemas(operatorInfo.inputPorts.head.id)
-      val outputSchemaBuilder = Schema.builder()
-      // keep the same schema from input
-      if (retainInputColumns) outputSchemaBuilder.add(inputSchema)
-      // for any pythonUDFType, it can add custom output columns (attributes).
-      if (outputColumns != null) {
-        if (retainInputColumns) { // check if columns are duplicated
+      var outputSchema = if (retainInputColumns) inputSchema else Schema()
+      // Add custom output columns if defined
+      if (outputColumns != null) {
+        if (retainInputColumns) {
+          // Check for duplicate column names
           for (column <- outputColumns) {
-            if (inputSchema.containsAttribute(column.getName))
-              throw new RuntimeException("Column name " + column.getName + " already exists!")
+            if (inputSchema.containsAttribute(column.getName)) {
+              throw new RuntimeException(s"Column name ${column.getName} already exists!")
+            }
           }
         }
-        outputSchemaBuilder.add(outputColumns).build()
+        // Add output columns to the schema
+        outputSchema = outputSchema.add(outputColumns)
       }
-      Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build())
+
+      Map(operatorInfo.outputPorts.head.id -> outputSchema)
     }
 
     if (workers > 1) {
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala
index a219ba2808a..ca45e0408d0 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/source/PythonUDFSourceOpDescV2.scala
@@ -72,10 +72,10 @@ class PythonUDFSourceOpDescV2 extends SourceOperatorDescriptor {
   }
 
   override def sourceSchema(): Schema = {
-    val outputSchemaBuilder = Schema.builder()
-    if (columns.nonEmpty && columns != null) {
-      outputSchemaBuilder.add(columns)
+    if (columns != null && columns.nonEmpty) {
+      Schema().add(columns)
+    } else {
+      Schema()
     }
-    outputSchemaBuilder.build()
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala
index bc9d6ec1b5e..5e0815f506a 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFOpDesc.scala
@@ -73,21 +73,23 @@ class RUDFOpDesc extends LogicalOp {
     val propagateSchema = (inputSchemas: Map[PortIdentity, Schema]) => {
       val inputSchema = inputSchemas(operatorInfo.inputPorts.head.id)
-      val outputSchemaBuilder = Schema.builder()
-      // keep the same schema from input
-      if (retainInputColumns) outputSchemaBuilder.add(inputSchema)
-      // for any javaUDFType, it can add custom output columns (attributes).
-      if (outputColumns != null) {
-        if (retainInputColumns) { // check if columns are duplicated
+      var outputSchema = if (retainInputColumns) inputSchema else Schema()
+      // Add custom output columns if provided
+      if (outputColumns != null) {
+        if (retainInputColumns) {
+          // Check for duplicate column names
           for (column <- outputColumns) {
-            if (inputSchema.containsAttribute(column.getName))
-              throw new RuntimeException("Column name " + column.getName + " already exists!")
+            if (inputSchema.containsAttribute(column.getName)) {
+              throw new RuntimeException(s"Column name ${column.getName} already exists!")
+            }
           }
         }
-        outputSchemaBuilder.add(outputColumns).build()
+        // Add output columns to the schema
+        outputSchema = outputSchema.add(outputColumns)
       }
-      Map(operatorInfo.outputPorts.head.id -> outputSchemaBuilder.build())
+
+      Map(operatorInfo.outputPorts.head.id -> outputSchema)
     }
 
     val r_operator_type = if (useTupleAPI) "r-tuple" else "r-table"
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala
index 19f65d42c0d..e84900f8ddf 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/r/RUDFSourceOpDesc.scala
@@ -86,10 +86,10 @@ class RUDFSourceOpDesc extends SourceOperatorDescriptor {
   }
 
   override def sourceSchema(): Schema = {
-    val outputSchemaBuilder = Schema.builder()
-    if (columns.nonEmpty && columns != null) {
-      outputSchemaBuilder.add(columns)
+    if (columns != null && columns.nonEmpty) {
+      Schema().add(columns)
+    } else {
+      Schema()
     }
-    outputSchemaBuilder.build()
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala
index 2eb0fefa152..6db50ce069e 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.unneststring
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.AttributeType
 import edu.uci.ics.amber.core.workflow.{PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.flatmap.FlatMapOpDesc
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
@@ -53,14 +53,10 @@ class UnnestStringOpDesc extends FlatMapOpDesc {
       .withOutputPorts(operatorInfo.outputPorts)
       .withPropagateSchema(
         SchemaPropagationFunc(inputSchemas => {
-          val outputSchema =
-            if (resultAttribute == null || resultAttribute.trim.isEmpty) null
-            else
-              Schema
-                .builder()
-                .add(inputSchemas.values.head)
-                .add(resultAttribute, AttributeType.STRING)
-                .build()
+          val outputSchema = Option(resultAttribute)
+            .filter(_.trim.nonEmpty)
+            .map(attr => inputSchemas.values.head.add(attr, AttributeType.STRING))
+            .getOrElse(throw new RuntimeException("Result attribute cannot be empty"))
           Map(operatorInfo.outputPorts.head.id -> outputSchema)
         })
       )
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala
index ff082be7b3d..6a41a1d3e72 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/DotPlot/DotPlotOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.DotPlot
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
@@ -20,10 +20,9 @@ class DotPlotOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala
index 16e682b4163..2928d379c94 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/IcicleChart/IcicleChartOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.IcicleChart
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
@@ -37,10 +37,9 @@ class IcicleChartOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala
index 5e85d1979b2..ccbf5f8f352 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ImageViz/ImageVisualizerOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.ImageViz
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
@@ -19,10 +19,9 @@ class ImageVisualizerOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala
index 4b6a366d7c4..19a113a7c0a 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ScatterMatrixChart/ScatterMatrixChartOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.ScatterMatrixChart
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.{
@@ -37,10 +37,9 @@ class ScatterMatrixChartOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala
index c3924b3275d..258cc8b03dd 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/barChart/BarChartOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.barChart
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
@@ -53,10 +53,9 @@ class BarChartOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala
index 5df97e865a1..ca55d3b7bd5 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/boxPlot/BoxPlotOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.boxPlot
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
@@ -40,10 +40,9 @@ class BoxPlotOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala
index 3a4db9d8e91..fb0dd91e6c0 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/bubbleChart/BubbleChartOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.bubbleChart
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} @@ -46,10 +46,9 @@ class BubbleChartOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala index 80ee1ff31e1..15e620d8a97 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/candlestickChart/CandlestickChartOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.candlestickChart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName @@ -44,10 +44,9 @@ class CandlestickChartOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( 
inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala index 78d818cc161..ac74425eb62 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/continuousErrorBands/ContinuousErrorBandsOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.continuousErrorBands import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode @@ -28,10 +28,9 @@ class ContinuousErrorBandsOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) 
Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala index 0a132c2c996..721da590564 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/contourPlot/ContourPlotOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.contourPlot import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName @@ -49,10 +49,9 @@ class ContourPlotOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala index bac0482bf8a..433f578fd6e 100644 --- 
a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/dumbbellPlot/DumbbellPlotOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.dumbbellPlot import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} @@ -62,10 +62,9 @@ class DumbbellPlotOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala index 32c250b55dd..0f2fe00eeed 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/figureFactoryTable/FigureFactoryTableOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.figureFactoryTable import 
com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode @@ -107,10 +107,9 @@ class FigureFactoryTableOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala index 2e4e0691a08..fa7bc133aeb 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/filledAreaPlot/FilledAreaPlotOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.filledAreaPlot import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, 
PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} @@ -49,10 +49,9 @@ class FilledAreaPlotOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala index a7e8075edff..3a82e95e702 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/funnelPlot/FunnelPlotOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.funnelPlot import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} @@ -38,10 +38,9 @@ class FunnelPlotOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val 
outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala index 382035b3d64..0cafc4e2143 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ganttChart/GanttChartOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.ganttChart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} @@ -56,10 +56,9 @@ class GanttChartOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala index 3b623fbccc3..29f1dd8af23 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/heatMap/HeatMapOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.heatMap import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} @@ -31,10 +31,9 @@ class HeatMapOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala index 3e09d51484c..20c06c3a677 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/hierarchychart/HierarchyChartOpDesc.scala @@ -2,7 +2,7 @@ package 
edu.uci.ics.amber.operator.visualization.hierarchychart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode @@ -40,10 +40,9 @@ class HierarchyChartOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala index 829f5355224..d63944e319e 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/histogram/HistogramChartOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.histogram import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import 
edu.uci.ics.amber.core.workflow.OutputPort.OutputMode import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} @@ -97,10 +97,9 @@ class HistogramChartOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala index 5d84d7e548b..1abb6fa8a28 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/htmlviz/HtmlVizOpDesc.scala @@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.visualization.htmlviz import com.fasterxml.jackson.annotation.JsonProperty import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle import edu.uci.ics.amber.core.executor.OpExecWithClassName -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalOp, SchemaPropagationFunc} import edu.uci.ics.amber.operator.LogicalOp import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName @@ -39,10 +39,7 @@ class HtmlVizOpDesc extends LogicalOp { .withOutputPorts(operatorInfo.outputPorts) .withPropagateSchema( SchemaPropagationFunc(inputSchemas => { - val outputSchema = Schema - .builder() 
- .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema().add("html-content", AttributeType.STRING) Map(operatorInfo.outputPorts.head.id -> outputSchema) }) ) diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala index 69eb7f83d12..37f8780fce9 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/lineChart/LineChartOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.lineChart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode @@ -29,10 +29,9 @@ class LineChartOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala index 
5ff0bfa88ae..400ce53f1e6 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/pieChart/PieChartOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.pieChart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} @@ -36,10 +36,9 @@ class PieChartOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala index 58f8759594b..a6e52ac5917 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/quiverPlot/QuiverPlotOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.quiverPlot import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import 
com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} @@ -45,10 +45,9 @@ class QuiverPlotOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala index ca8cff0cdcb..d468c64b6c8 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/sankeyDiagram/SankeyDiagramOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.sankeyDiagram import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import 
edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName @@ -32,10 +32,9 @@ class SankeyDiagramOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala index 2a42c2c84dc..5ce75d46e74 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatter3DChart/Scatter3dChartOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.scatter3DChart import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName @@ -37,10 +37,9 @@ class Scatter3dChartOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", 
AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala index d56a2c45d1d..1d6fa4a4068 100644 --- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala +++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/scatterplot/ScatterplotOpDesc.scala @@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.scatterplot import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription} import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle} -import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema} +import edu.uci.ics.amber.core.tuple.{AttributeType, Schema} import edu.uci.ics.amber.operator.PythonOperatorDescriptor import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity} import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo} @@ -63,10 +63,9 @@ class ScatterplotOpDesc extends PythonOperatorDescriptor { override def getOutputSchemas( inputSchemas: Map[PortIdentity, Schema] ): Map[PortIdentity, Schema] = { - val outputSchema = Schema - .builder() - .add(new Attribute("html-content", AttributeType.STRING)) - .build() + val outputSchema = Schema() + .add("html-content", AttributeType.STRING) + Map(operatorInfo.outputPorts.head.id -> outputSchema) Map(operatorInfo.outputPorts.head.id -> outputSchema) } diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala 
b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala
index 87d174d01e9..a8a30c2e7b9 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/tablesChart/TablesPlotOpDesc.scala
@@ -1,7 +1,7 @@
 package edu.uci.ics.amber.operator.visualization.tablesChart
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
@@ -83,10 +83,9 @@ class TablesPlotOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala
index 2840ea421da..8d6d9483d25 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/ternaryPlot/TernaryPlotOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.ternaryPlot
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
@@ -60,10 +60,9 @@ class TernaryPlotOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala
index 90482deaaa5..f89154a3810 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/urlviz/UrlVizOpDesc.scala
@@ -3,7 +3,7 @@ package edu.uci.ics.amber.operator.visualization.urlviz
 import com.fasterxml.jackson.annotation.JsonProperty
 import com.kjetland.jackson.jsonSchema.annotations.{JsonSchemaInject, JsonSchemaTitle}
 import edu.uci.ics.amber.core.executor.OpExecWithClassName
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PhysicalOp, SchemaPropagationFunc}
 import edu.uci.ics.amber.operator.LogicalOp
 import edu.uci.ics.amber.core.virtualidentity.{ExecutionIdentity, WorkflowIdentity}
@@ -50,10 +50,7 @@
       .withOutputPorts(operatorInfo.outputPorts)
       .withPropagateSchema(
         SchemaPropagationFunc(_ => {
-          val outputSchema = Schema
-            .builder()
-            .add(new Attribute("html-content", AttributeType.STRING))
-            .build()
+          val outputSchema = Schema().add("html-content", AttributeType.STRING)
           Map(operatorInfo.outputPorts.head.id -> outputSchema)
         })
       )
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala
index 2ba19165765..8aa21b9e43a 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/waterfallChart/WaterfallChartOpDesc.scala
@@ -2,7 +2,7 @@ package edu.uci.ics.amber.operator.visualization.waterfallChart
 
 import com.fasterxml.jackson.annotation.{JsonProperty, JsonPropertyDescription}
 import com.kjetland.jackson.jsonSchema.annotations.JsonSchemaTitle
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
@@ -26,10 +26,9 @@ class WaterfallChartOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala
index e6e2c408e48..d69013040f4 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/visualization/wordCloud/WordCloudOpDesc.scala
@@ -6,13 +6,13 @@ import com.kjetland.jackson.jsonSchema.annotations.{
   JsonSchemaInt,
   JsonSchemaTitle
 }
-import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema}
+import edu.uci.ics.amber.core.tuple.{AttributeType, Schema}
+import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
+import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 import edu.uci.ics.amber.operator.PythonOperatorDescriptor
 import edu.uci.ics.amber.operator.metadata.annotations.AutofillAttributeName
 import edu.uci.ics.amber.operator.metadata.{OperatorGroupConstants, OperatorInfo}
 import edu.uci.ics.amber.operator.visualization.ImageUtility
-import edu.uci.ics.amber.core.workflow.OutputPort.OutputMode
-import edu.uci.ics.amber.core.workflow.{InputPort, OutputPort, PortIdentity}
 class WordCloudOpDesc extends PythonOperatorDescriptor {
   @JsonProperty(required = true)
   @JsonSchemaTitle("Text column")
@@ -27,10 +27,9 @@ class WordCloudOpDesc extends PythonOperatorDescriptor {
   override def getOutputSchemas(
       inputSchemas: Map[PortIdentity, Schema]
   ): Map[PortIdentity, Schema] = {
-    val outputSchema = Schema
-      .builder()
-      .add(new Attribute("html-content", AttributeType.STRING))
-      .build()
+    val outputSchema = Schema()
+      .add("html-content", AttributeType.STRING)
+    Map(operatorInfo.outputPorts.head.id -> outputSchema)
     Map(operatorInfo.outputPorts.head.id -> outputSchema)
   }
 
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpExecSpec.scala
index 60725d53295..29d79438132 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/cartesianProduct/CartesianProductOpExecSpec.scala
@@ -38,7 +38,7 @@ class CartesianProductOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
       .map(num =>
         new Attribute(base_name + (if (append_num) "#@" + num else ""), AttributeType.STRING)
       )
-    Schema.builder().add(attrs).build()
+    Schema().add(attrs)
   }
 
   before {
@@ -93,16 +93,10 @@ class CartesianProductOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
     val numRightTuples: Int = 3
 
     val duplicateAttribute: Attribute = new Attribute("left", AttributeType.STRING)
-    val leftSchema = Schema
-      .builder()
-      .add(generate_schema("left", numLeftSchemaAttributes - 1))
+    val leftSchema = generate_schema("left", numLeftSchemaAttributes - 1)
       .add(duplicateAttribute)
-      .build()
-    val rightSchema = Schema
-      .builder()
-      .add(generate_schema("right", numRightSchemaAttributes - 1))
+    val rightSchema = generate_schema("right", numRightSchemaAttributes - 1)
       .add(duplicateAttribute)
-      .build()
 
     val inputSchemas = Map(PortIdentity() -> leftSchema, PortIdentity(1) -> rightSchema)
     val outputSchema = opDesc.getExternalOutputSchemas(inputSchemas).values.head
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala
index 1d19700e071..2a4d6745523 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/dictionary/DictionaryMatcherOpExecSpec.scala
@@ -7,12 +7,10 @@ import org.scalatest.BeforeAndAfter
 import org.scalatest.flatspec.AnyFlatSpec
 
 class DictionaryMatcherOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
-  val tupleSchema: Schema = Schema
-    .builder()
+  val tupleSchema: Schema = Schema()
     .add(new Attribute("field1", AttributeType.STRING))
    .add(new Attribute("field2", AttributeType.INTEGER))
     .add(new Attribute("field3", AttributeType.BOOLEAN))
-    .build()
 
   val tuple: Tuple = Tuple
     .builder(tupleSchema)
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/difference/DifferenceOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/difference/DifferenceOpExecSpec.scala
index 37aa0e35004..6f1813c7319 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/difference/DifferenceOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/difference/DifferenceOpExecSpec.scala
@@ -8,14 +8,10 @@ class DifferenceOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
   var input2: Int = 1
   var opExec: DifferenceOpExec = _
   var counter: Int = 0
-  val schema: Schema = Schema
-    .builder()
-    .add(
-      new Attribute("field1", AttributeType.STRING),
-      new Attribute("field2", AttributeType.INTEGER),
-      new Attribute("field3", AttributeType.BOOLEAN)
-    )
-    .build()
+  val schema: Schema = Schema()
+    .add(new Attribute("field1", AttributeType.STRING))
+    .add(new Attribute("field2", AttributeType.INTEGER))
+    .add(new Attribute("field3", AttributeType.BOOLEAN))
 
   def tuple(): Tuple = {
     counter += 1
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/distinct/DistinctOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/distinct/DistinctOpExecSpec.scala
index 8f397a6dc49..865693f274e 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/distinct/DistinctOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/distinct/DistinctOpExecSpec.scala
@@ -4,12 +4,10 @@ import org.scalatest.BeforeAndAfter
 import org.scalatest.flatspec.AnyFlatSpec
 import edu.uci.ics.amber.core.tuple.{Attribute, AttributeType, Schema, Tuple, TupleLike}
 
 class DistinctOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
-  val tupleSchema: Schema = Schema
-    .builder()
+  val tupleSchema: Schema = Schema()
     .add(new Attribute("field1", AttributeType.STRING))
     .add(new Attribute("field2", AttributeType.INTEGER))
     .add(new Attribute("field3", AttributeType.BOOLEAN))
-    .build()
 
   val tuple: () => Tuple = () =>
     Tuple
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala
index a17642c8286..47dc26e1cc7 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/filter/SpecializedFilterOpExecSpec.scala
@@ -14,19 +14,17 @@ class SpecializedFilterOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
     .map(attributeType =>
       Tuple
         .builder(
-          Schema.builder().add(new Attribute(attributeType.name(), attributeType)).build()
+          Schema().add(new Attribute(attributeType.name(), attributeType))
         )
         .add(new Attribute(attributeType.name(), attributeType), null)
         .build()
     )
-  val tupleSchema: Schema = Schema
-    .builder()
+  val tupleSchema: Schema = Schema()
     .add(new Attribute("string", AttributeType.STRING))
     .add(new Attribute("int", AttributeType.INTEGER))
     .add(new Attribute("bool", AttributeType.BOOLEAN))
     .add(new Attribute("long", AttributeType.LONG))
-    .build()
 
   val allNullTuple: Tuple = Tuple
     .builder(tupleSchema)
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala
index c3cd8f6ecd3..c31d5eb25ab 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/hashJoin/HashJoinOpSpec.scala
@@ -23,11 +23,10 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter {
   var opDesc: HashJoinOpDesc[String] = _
 
   def getInternalHashTableSchema(buildInputSchema: Schema): Schema = {
-    Schema
-      .builder()
+    Schema()
       .add(HASH_JOIN_INTERNAL_KEY_NAME, AttributeType.ANY)
       .add(buildInputSchema)
-      .build()
+
   }
 
   def tuple(name: String, n: Int = 1, i: Option[Int]): Tuple = {
@@ -39,13 +38,10 @@ class HashJoinOpSpec extends AnyFlatSpec with BeforeAndAfter {
   }
 
   def schema(name: String, n: Int = 1): Schema = {
-    Schema
-      .builder()
-      .add(
-        new Attribute(name, AttributeType.STRING),
-        new Attribute(name + "_" + n, AttributeType.STRING)
-      )
-      .build()
+    Schema()
+      .add(new Attribute(name, AttributeType.STRING))
+      .add(new Attribute(name + "_" + n, AttributeType.STRING))
+
   }
 
   it should "work with basic two input streams with different buildAttributeName and probeAttributeName" in {
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intersect/IntersectOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intersect/IntersectOpExecSpec.scala
index 03c10c310b6..2310b8c5ccf 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intersect/IntersectOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intersect/IntersectOpExecSpec.scala
@@ -11,14 +11,10 @@ class IntersectOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
   var opExec: IntersectOpExec = _
   var counter: Int = 0
 
-  val tupleSchema: Schema = Schema
-    .builder()
+  val tupleSchema: Schema = Schema()
     .add(new Attribute("field1", AttributeType.STRING))
     .add(new Attribute("field2", AttributeType.INTEGER))
-    .add(
-      new Attribute("field3", AttributeType.BOOLEAN)
-    )
-    .build()
+    .add(new Attribute("field3", AttributeType.BOOLEAN))
 
   def physicalOpId(): PhysicalOpIdentity = {
     counter += 1
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala
index e8c26d84a23..73090da01b8 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/intervalJoin/IntervalOpExecSpec.scala
@@ -62,13 +62,10 @@ class IntervalOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
   }
 
   def schema(name: String, attributeType: AttributeType, n: Int = 1): Schema = {
-    Schema
-      .builder()
-      .add(
-        new Attribute(name, attributeType),
-        new Attribute(name + "_" + n, attributeType)
-      )
-      .build()
+    Schema()
+      .add(new Attribute(name, attributeType))
+      .add(new Attribute(name + "_" + n, attributeType))
+
  }
 
   def longTuple(name: String, n: Int = 1, i: Long): Tuple = {
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExecSpec.scala
index c0e12804e7c..60bc0471532 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/keywordSearch/KeywordSearchOpExecSpec.scala
@@ -9,10 +9,8 @@ class KeywordSearchOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
   val inputPort: Int = 0
 
   val opDesc: KeywordSearchOpDesc = new KeywordSearchOpDesc()
-  val schema: Schema = Schema
-    .builder()
+  val schema: Schema = Schema()
     .add(new Attribute("text", AttributeType.STRING))
-    .build()
 
   def createTuple(text: String): Tuple = {
     Tuple
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExecSpec.scala
index edd889734d7..bdd5a58f94d 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/projection/ProjectionOpExecSpec.scala
@@ -5,12 +5,10 @@ import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import org.scalatest.BeforeAndAfter
 import org.scalatest.flatspec.AnyFlatSpec
 
 class ProjectionOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
-  val tupleSchema: Schema = Schema
-    .builder()
+  val tupleSchema: Schema = Schema()
     .add(new Attribute("field1", AttributeType.STRING))
     .add(new Attribute("field2", AttributeType.INTEGER))
     .add(new Attribute("field3", AttributeType.BOOLEAN))
-    .build()
 
   val tuple: Tuple = Tuple
     .builder(tupleSchema)
@@ -38,11 +36,9 @@ class ProjectionOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
       new AttributeUnit("field2", "f2"),
       new AttributeUnit("field1", "f1")
     )
-    val outputSchema = Schema
-      .builder()
+    val outputSchema = Schema()
      .add(new Attribute("f1", AttributeType.STRING))
       .add(new Attribute("f2", AttributeType.INTEGER))
-      .build()
 
     val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc))
     projectionOpExec.open()
@@ -64,11 +60,9 @@ class ProjectionOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
       new AttributeUnit("field3", "f3"),
       new AttributeUnit("field1", "f1")
     )
-    val outputSchema = Schema
-      .builder()
+    val outputSchema = Schema()
       .add(new Attribute("f3", AttributeType.BOOLEAN))
       .add(new Attribute("f1", AttributeType.STRING))
-      .build()
 
     val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc))
     projectionOpExec.open()
@@ -121,11 +115,9 @@ class ProjectionOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
       new AttributeUnit("field2", "f2"),
       new AttributeUnit("field1", "")
     )
-    val outputSchema = Schema
-      .builder()
+    val outputSchema = Schema()
       .add(new Attribute("field1", AttributeType.STRING))
       .add(new Attribute("f2", AttributeType.INTEGER))
-      .build()
 
     val projectionOpExec = new ProjectionOpExec(objectMapper.writeValueAsString(opDesc))
     projectionOpExec.open()
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExecSpec.scala
index aeab7443c1d..6a7966d9e0c 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/sortPartitions/SortPartitionsOpExecSpec.scala
@@ -5,12 +5,10 @@ import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import org.scalatest.BeforeAndAfter
 import org.scalatest.flatspec.AnyFlatSpec
 
 class SortPartitionsOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
-  val tupleSchema: Schema = Schema
-    .builder()
+  val tupleSchema: Schema = Schema()
     .add(new Attribute("field1", AttributeType.STRING))
     .add(new Attribute("field2", AttributeType.INTEGER))
     .add(new Attribute("field3", AttributeType.BOOLEAN))
-    .build()
 
   val tuple: Int => Tuple = i =>
     Tuple
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpExecSpec.scala
index d53e857285f..f0e45a713de 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/symmetricDifference/SymmetricDifferenceOpExecSpec.scala
@@ -13,14 +13,10 @@ import edu.uci.ics.amber.core.tuple.{
 class SymmetricDifferenceOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
   var opExec: SymmetricDifferenceOpExec = _
   var counter: Int = 0
-  val schema: Schema = Schema
-    .builder()
-    .add(
-      new Attribute("field1", AttributeType.STRING),
-      new Attribute("field2", AttributeType.INTEGER),
-      new Attribute("field3", AttributeType.BOOLEAN)
-    )
-    .build()
+  val schema: Schema = Schema()
+    .add(new Attribute("field1", AttributeType.STRING))
+    .add(new Attribute("field2", AttributeType.INTEGER))
+    .add(new Attribute("field3", AttributeType.BOOLEAN))
 
   def tuple(): Tuple = {
     counter += 1
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala
index 39b3b5e8d60..83487206e06 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/typecasting/TypeCastingOpExecSpec.scala
@@ -5,21 +5,18 @@ import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import org.scalatest.BeforeAndAfter
 import org.scalatest.flatspec.AnyFlatSpec
 
 class TypeCastingOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
-  val tupleSchema: Schema = Schema
-    .builder()
+  val tupleSchema: Schema = Schema()
     .add(new Attribute("field1", AttributeType.STRING))
     .add(new Attribute("field2", AttributeType.INTEGER))
     .add(new Attribute("field3", AttributeType.BOOLEAN))
     .add(new Attribute("field4", AttributeType.LONG))
-    .build()
-  val castToSchema: Schema = Schema
-    .builder()
+  val castToSchema: Schema = Schema()
     .add(new Attribute("field1", AttributeType.STRING))
     .add(new Attribute("field2", AttributeType.STRING))
     .add(new Attribute("field3", AttributeType.STRING))
     .add(new Attribute("field4", AttributeType.LONG))
-    .build()
+
   val castingUnit1 = new TypeCastingUnit()
   castingUnit1.attribute = "field2"
   castingUnit1.resultType = AttributeType.STRING
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala
index d63b82900bb..29905bbccfc 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/operator/unneststring/UnnestStringOpExecSpec.scala
@@ -6,12 +6,10 @@ import edu.uci.ics.amber.util.JSONUtils.objectMapper
 import org.scalatest.BeforeAndAfter
 import org.scalatest.flatspec.AnyFlatSpec
 
 class UnnestStringOpExecSpec extends AnyFlatSpec with BeforeAndAfter {
-  val tupleSchema: Schema = Schema
-    .builder()
+  val tupleSchema: Schema = Schema()
     .add(new Attribute("field1", AttributeType.STRING))
     .add(new Attribute("field2", AttributeType.INTEGER))
     .add(new Attribute("field3", AttributeType.STRING))
-    .build()
 
   val tuple: Tuple = Tuple
     .builder(tupleSchema)
diff --git a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/util/ArrowUtilsSpec.scala b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/util/ArrowUtilsSpec.scala
index 02367732418..bc20e11ab86 100644
--- a/core/workflow-operator/src/test/scala/edu/uci/ics/amber/util/ArrowUtilsSpec.scala
+++ b/core/workflow-operator/src/test/scala/edu/uci/ics/amber/util/ArrowUtilsSpec.scala
@@ -25,15 +25,13 @@ class ArrowUtilsSpec extends AnyFlatSpec {
   val timestamp = new ArrowType.Timestamp(TimeUnit.MILLISECOND, "UTC")
   val string: ArrowType.Utf8 = ArrowType.Utf8.INSTANCE
 
-  val texeraSchema: Schema = Schema
-    .builder()
+  val texeraSchema: Schema = Schema()
     .add("test-1", AttributeType.INTEGER)
     .add("test-2", AttributeType.LONG)
     .add("test-3", AttributeType.BOOLEAN)
     .add("test-4", AttributeType.DOUBLE)
     .add("test-5", AttributeType.TIMESTAMP)
     .add("test-6", AttributeType.STRING)
-    .build()
 
   val arrowSchema: org.apache.arrow.vector.types.pojo.Schema =
     new org.apache.arrow.vector.types.pojo.Schema(

From f2aeb0a189922cc990040b24947ffb3efa85a511 Mon Sep 17 00:00:00 2001
From: Yicong Huang <17627829+Yicong-Huang@users.noreply.github.com>
Date: Wed, 1 Jan 2025 13:46:21 -0800
Subject: [PATCH 28/47] Fix Python UDF source detection (#3189)

PhysicalOp relies on the number of input ports to determine whether an
operator is a source operator. For Python UDFs, due to the changes in #3183,
the input ports were not correctly associated with the PhysicalOp, causing
all Python UDFs to be recognized as source operators. This PR fixes the
issue.

---
 .../architecture/pythonworker/PythonProxyClient.scala     | 2 +-
 .../ics/amber/operator/udf/python/PythonUDFOpDescV2.scala | 7 +++++--
 2 files changed, 6 insertions(+), 3 deletions(-)

diff --git a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala
index c7dc6400c1e..61a1a2641d9 100644
--- a/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/amber/engine/architecture/pythonworker/PythonProxyClient.scala
@@ -69,7 +69,7 @@ class PythonProxyClient(portNumberPromise: Promise[Int], val actorId: ActorVirtu
        logger.warn(
          s"Failed to connect to Flight Server in this attempt, retrying after $UNIT_WAIT_TIME_MS ms... remaining attempts: ${MAX_TRY_COUNT - tryCount}"
        )
-        flightClient.close()
+        if (flightClient != null) flightClient.close()
         Thread.sleep(UNIT_WAIT_TIME_MS)
         tryCount += 1
       }
diff --git a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala
index 1c070636ebd..802b8d7d544 100644
--- a/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala
+++ b/core/workflow-operator/src/main/scala/edu/uci/ics/amber/operator/udf/python/PythonUDFOpDescV2.scala
@@ -93,7 +93,7 @@ class PythonUDFOpDescV2 extends LogicalOp {
       Map(operatorInfo.outputPorts.head.id -> outputSchema)
     }
 
-    if (workers > 1) {
+    val physicalOp = if (workers > 1) {
       PhysicalOp
         .oneToOnePhysicalOp(
           workflowId,
@@ -112,7 +112,10 @@ class PythonUDFOpDescV2 extends LogicalOp {
          OpExecWithCode(code, "python")
        )
        .withParallelizable(false)
-    }.withDerivePartition(_ => UnknownPartition())
+    }
+
+    physicalOp
+      .withDerivePartition(_ => UnknownPartition())
       .withInputPorts(operatorInfo.inputPorts)
       .withOutputPorts(operatorInfo.outputPorts)
       .withPartitionRequirement(partitionRequirement)

From 19644b4135a213f44a8dc454c108eff3d6c404d2 Mon Sep 17 00:00:00 2001
From: Shengquan Ni <13672781+shengquan-ni@users.noreply.github.com>
Date: Mon, 6 Jan 2025 13:26:52 -0800
Subject: [PATCH 29/47] Fix CI failures by pinning the ubuntu version for backend CI (#3194)

The ubuntu-latest image has been updated to 24.04 from 22.04 in recent days.
However, the new image is incompatible with libncurses5, requiring an upgrade
to libncurses6. Unfortunately, after upgrading, sbt no longer functions as
expected, an issue also documented here:
[actions/setup-java#712](https://github.com/actions/setup-java/issues/712).
It appears that the 24.04 image does not include sbt by default.

This PR addresses the issue by pinning the image to ubuntu-22.04.
We can revisit and update the version when the 24.04 image becomes more
stable and resolves these compatibility problems.

---
 .github/workflows/github-action-build.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/github-action-build.yml b/.github/workflows/github-action-build.yml
index 351064fb4a2..4c07bd728ca 100644
--- a/.github/workflows/github-action-build.yml
+++ b/.github/workflows/github-action-build.yml
@@ -56,7 +56,7 @@ jobs:
   core:
     strategy:
       matrix:
-        os: [ ubuntu-latest ]
+        os: [ ubuntu-22.04 ]
         java-version: [ 11 ]
     runs-on: ${{ matrix.os }}
     env:

From a1186f8347daa05927b72c6a70e9a35b36367692 Mon Sep 17 00:00:00 2001
From: yunyad <114192306+yunyad@users.noreply.github.com>
Date: Tue, 7 Jan 2025 09:23:18 -0800
Subject: [PATCH 30/47] Add avatar in execution history dashboard (#3196)

In this PR, we add the user avatar to the execution history panel.

Screenshot 2025-01-06 at 3 53 23 PM

---
 .../user/workflow/WorkflowExecutionsResource.scala  | 11 +++++------
 .../workflow-execution-history.component.html       | 9 ++++-----
 .../app/dashboard/type/workflow-executions-entry.ts | 1 +
 3 files changed, 10 insertions(+), 11 deletions(-)

diff --git a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala
index 132e9bdd7b3..7b51e853fdc 100644
--- a/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala
+++ b/core/amber/src/main/scala/edu/uci/ics/texera/web/resource/dashboard/user/workflow/WorkflowExecutionsResource.scala
@@ -96,6 +96,7 @@ object WorkflowExecutionsResource {
       eId: UInteger,
       vId: UInteger,
       userName: String,
+      googleAvatar: String,
       status: Byte,
       result: String,
       startingTime: Timestamp,
@@ -193,12 +194,8 @@ class WorkflowExecutionsResource {
       .select(
         WORKFLOW_EXECUTIONS.EID,
         WORKFLOW_EXECUTIONS.VID,
-        field(
-          context
-            .select(USER.NAME)
-            .from(USER)
-            .where(WORKFLOW_EXECUTIONS.UID.eq(USER.UID))
-        ),
+        USER.NAME,
+        USER.GOOGLE_AVATAR,
         WORKFLOW_EXECUTIONS.STATUS,
         WORKFLOW_EXECUTIONS.RESULT,
         WORKFLOW_EXECUTIONS.STARTING_TIME,
@@ -210,6 +207,8 @@ class WorkflowExecutionsResource {
       .from(WORKFLOW_EXECUTIONS)
       .join(WORKFLOW_VERSION)
       .on(WORKFLOW_VERSION.VID.eq(WORKFLOW_EXECUTIONS.VID))
+      .join(USER)
+      .on(WORKFLOW_EXECUTIONS.UID.eq(USER.UID))
       .where(WORKFLOW_VERSION.WID.eq(wid))
       .fetchInto(classOf[WorkflowExecutionEntry])
       .asScala
diff --git a/core/gui/src/app/dashboard/component/user/user-workflow/ngbd-modal-workflow-executions/workflow-execution-history.component.html b/core/gui/src/app/dashboard/component/user/user-workflow/ngbd-modal-workflow-executions/workflow-execution-history.component.html
index af59d739c4a..1c6134675b3 100644
--- a/core/gui/src/app/dashboard/component/user/user-workflow/ngbd-modal-workflow-executions/workflow-execution-history.component.html
+++ b/core/gui/src/app/dashboard/component/user/user-workflow/ngbd-modal-workflow-executions/workflow-execution-history.component.html
@@ -143,11 +143,10 @@
             nzType="star">
-
+