Tag Results

Items tagged with "rapidminer" (57)

Note: some items may not be visible to you, due to viewing permissions.


Groups (1)

Network-member RapidMiner Demo

Unique name: rapidminer_demo
Created: Saturday 01 May 2010 @ 09:49:28 (GMT)

This group is created for demo processes used in RapidMiner documentation, manual, or training. Feel free to add workflows if you consider them instructive examples.

5 shared items   |   1 announcement

Members (27):

Tags:

Latest announcement: RCOMM 2011

Files (2)
Uploader

Blob Datasets for the pack: RCOMM2011 recommender systems...

Created: 05/05/11 @ 21:18:51 | Last updated: 06/05/11 @ 12:13:22

Credits: User Matko Bošnjak User Ninoaf

License: Creative Commons Attribution-Share Alike 3.0 Unported License

Dataset description: items — a concatenated train and test set from the ECML/PKDD Discovery Challenge 2011. Only the ID and name attributes were kept; the other attributes were discarded because of the size of the dataset. This example set holds the content information for each of the items, identified by ID. user_history — an example set of IDs randomly sampled from the items dataset. It represents the user's history: all the items (in this case lectures) he has viewed. u...
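The user_history construction described above — randomly sampling item IDs to stand in for a user's viewing history — can be sketched in Python. This is a hypothetical illustration only; the `sample_user_history` helper and the numeric IDs are invented for the example:

```python
import random

def sample_user_history(item_ids, k, seed=0):
    """Randomly sample k item IDs to simulate a user's viewing history."""
    rng = random.Random(seed)          # seeded for reproducibility
    return rng.sample(item_ids, k)     # k distinct IDs, no repeats

items = list(range(100))               # invented item IDs
history = sample_user_history(items, 5)
print(history)
```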

File type: ZIP archive

Rating: 0.0 / 5 (0 ratings) | Comments: 0 | Viewed: 118 times | Downloaded: 97 times

Tags:

Uploader

Blob Experimental user to item score matrix Excel file

Created: 26/11/11 @ 20:07:02 | Last updated: 26/11/11 @ 20:07:04

Credits: User Matko Bošnjak

License: Creative Commons Attribution-Share Alike 3.0 Unported License

A test file for the Collaborative filtering recommender workflow.

File type: Excel workbook

Rating: 0.0 / 5 (0 ratings) | Comments: 0 | Viewed: 38 times | Downloaded: 23 times

Tags:

Workflows (45)
Original Uploader

Workflow Image Mining with RapidMiner (1)

Created: 28/04/10 @ 11:00:37 | Last updated: 16/01/12 @ 14:16:23

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is an image mining process using the image mining Web service provided by NHRF within e-Lico. It first uploads a set of images found in a directory, then preprocesses the images and visualizes the result. Furthermore, references to the uploaded images are stored in the local RapidMiner repository so they can later be used for further processing without uploading images a second time.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 1 | Citations: 0

Viewed: 1109 times | Downloaded: 469 times

Tags (5):

Original Uploader

Workflow Looping over Examples for doing de-aggrega... (1)

Created: 29/04/10 @ 16:21:56

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process is based on (artificially generated) data that looks as if it had been aggregated beforehand. The integer attribute Qty specifies the quantity of the item represented by the rest of the example. The process loops over every example and, for each one, runs an inner loop that appends the current example to a new example set. This example set was created as an empty copy of the original example set, so that the attributes are equal. To get access to and rem...
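The de-aggregation loop described above can be sketched outside RapidMiner in plain Python. This is a minimal sketch; the `deaggregate` helper and the toy rows are invented:

```python
def deaggregate(rows, qty_key="Qty"):
    """Expand each aggregated row into Qty identical single-unit rows,
    mirroring the nested example loops in the workflow."""
    out = []
    for row in rows:
        # copy of the row without the quantity attribute
        unit = {k: v for k, v in row.items() if k != qty_key}
        out.extend(dict(unit) for _ in range(row[qty_key]))
    return out

data = [{"item": "apple", "Qty": 3}, {"item": "pear", "Qty": 1}]
expanded = deaggregate(data)
print(expanded)
```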

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 291 times | Downloaded: 362 times

Tags (5):

Original Uploader

Workflow Using Remember / Recall for "tunneling" re... (1)

Created: 29/04/10 @ 16:07:55 | Last updated: 16/01/12 @ 16:35:12

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process shows how the Remember and Recall operators can be used to pass results from one position in the process to another when a direct connection is impossible. It also introduces another advanced RapidMiner technique: macro handling. We use the predefined macro a, accessed by %{a}, which gives the apply count of the operator. So we remember each application of the models generated in the learning subprocess of the Split Validation. Af...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 215 times | Downloaded: 73 times

Tags (5):

Original Uploader

Workflow CamelCases (1)

Created: 02/06/10 @ 12:33:44

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process splits up CamelCase words.
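For comparison, the same CamelCase splitting can be done with a single regular expression in Python. This is a sketch; the pattern below is one common choice, not necessarily the one the workflow uses:

```python
import re

def split_camel_case(token):
    """Split a CamelCase token into words: lowercase runs,
    acronym runs (e.g. HTTP), and digit runs."""
    return re.findall(r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])|\d+", token)

print(split_camel_case("RapidMinerWorkflow"))  # ['Rapid', 'Miner', 'Workflow']
```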

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 62 times | Downloaded: 28 times

Tags (2):

Original Uploader

Workflow Connect to twitter and analyze the key words (1)

Created: 26/07/10 @ 04:12:58 | Last updated: 26/07/10 @ 04:23:35

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

Hi All, This workflow connects RapidMiner to Twitter and downloads the timeline. It then creates a wordlist from the tweets and breaks them into key words that are mentioned in the tweets. You can then visualize the key words mentioned in the tweets. This workflow can be further modified to review various key events that have been talked about in the twitterland. Do let me know your feedback and feel free to ask me any questions that you may have. Shaily web: http://advanced-analyti...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 775 times | Downloaded: 708 times

Tags (7):

Original Uploader

Workflow 2. Getting Started: Retrieve and Apply a M... (1)

Created: 17/01/11 @ 08:50:27 | Last updated: 19/01/11 @ 09:46:40

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This getting started process demonstrates how to load (retrieve) a model from the repository and apply it to a data set. The result is a data set (at the "labeled data" output) with a new "prediction" attribute that indicates the prediction for each example (i.e. row/record). You will need to adjust the path of the Retrieve operator to the actual location where the model was stored by a previous execution of the "1. Getting Started: Learn and Store a...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 166 times | Downloaded: 62 times

Tags (7):

Original Uploader

Workflow 1. Getting Started: Learn and Store a Model (1)

Created: 17/01/11 @ 08:40:34 | Last updated: 17/01/11 @ 08:56:57

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This getting started process shows the first step of learning and storing a model. After a model is learned, you can load (Retrieve operator) the model and apply it to a test data set (see 2. Getting Started: Retrieve and Apply Model). The process is NOT concerned with evaluation of the model. This process will not immediately run in RapidMiner because you have to adjust the repository path in the Retrieve operator. Tags: Rapidminer, model, learn, learning, training, train, store, first step

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 101 times | Downloaded: 38 times

Tags (9):

Original Uploader

Workflow Change Class Distribution of Your Training... (1)

Created: 21/01/11 @ 14:57:11 | Last updated: 21/01/11 @ 14:57:12

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This example process shows how to change the class distribution of your training data set (in this case the training data is whatever comes out of the "myData reader"). The given training set has a distribution of 10 "Iris-setosa" examples, 40 "Iris-versicolor" examples and 50 "Iris-virginica" examples. The aim is to get a data set with a different class distribution for the label, let's say 10 "Iris-setosa", 20 "Iris-versicolor" and 20 "Iris-virginica". Beware that this may change some propert...
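The per-class downsampling described above can be sketched in Python. This is a hypothetical illustration, not the workflow's operator chain; `resample_by_class` and the toy rows are invented, using the class counts from the description:

```python
import random

def resample_by_class(rows, target_counts, label_key="label", seed=42):
    """Downsample each class to the requested number of examples."""
    rng = random.Random(seed)
    out = []
    for cls, n in target_counts.items():
        members = [r for r in rows if r[label_key] == cls]
        out.extend(rng.sample(members, n))   # n examples of this class
    return out

data = ([{"label": "Iris-setosa"}] * 10
        + [{"label": "Iris-versicolor"}] * 40
        + [{"label": "Iris-virginica"}] * 50)
balanced = resample_by_class(data, {"Iris-setosa": 10,
                                    "Iris-versicolor": 20,
                                    "Iris-virginica": 20})
```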

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 85 times | Downloaded: 37 times

Tags (22):

Original Uploader

Workflow Random recommender (1)

Created: 15/03/11 @ 15:22:49 | Last updated: 15/03/11 @ 15:28:09

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process does a random item recommendation; for a given item ID, from the example set of items, it randomly recommends a desired number of items. The purpose of this workflow is to produce a random recommendation baseline for comparison with different recommendation solutions, on different retrieval measures. The inputs to the process are context defined macros: %{id} defines an item ID for which we would like to obtain recommendation and %{recommender_no} defines the required number of ...
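The random baseline described above can be sketched in a few lines of Python. The `random_recommend` helper and the numeric IDs are invented for the example; the workflow itself takes %{id} and %{recommender_no} as context macros:

```python
import random

def random_recommend(item_ids, query_id, n, seed=None):
    """Baseline: recommend n random items, excluding the query item itself."""
    pool = [i for i in item_ids if i != query_id]
    return random.Random(seed).sample(pool, n)

recs = random_recommend(list(range(50)), query_id=7, n=5, seed=1)
print(recs)
```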

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 90 times | Downloaded: 28 times

Tags (5):

Original Uploader

Workflow Collaborative filtering recommender (1)

Created: 15/03/11 @ 15:27:10 | Last updated: 06/03/12 @ 13:07:22

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process executes a collaborative filtering recommender based on a user-to-item score matrix. The recommender predicts a user's score on some of his unscored items based on similarity with other users. The inputs to the process are context-defined macros: %{id} defines an item ID for which we would like to obtain recommendations, %{recommender_no} defines the required number of recommendations and %{number_of_neighbors} defines the number of the most similar users taken into a...
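The core prediction step — scoring an unscored item from the most similar users who did score it — can be sketched in Python. This is a simplified sketch under the assumption that a zero entry means "not scored"; the helpers and the 3x3 toy matrix are invented:

```python
import math

def cosine(u, v):
    """Cosine similarity between two score vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

def predict_score(matrix, user, item, k=2):
    """Predict matrix[user][item] as the mean score of the k most
    similar other users who have scored that item (0 = not scored)."""
    neighbors = [(cosine(matrix[user], row), row[item])
                 for i, row in enumerate(matrix)
                 if i != user and row[item] != 0]
    neighbors.sort(reverse=True)           # most similar first
    top = neighbors[:k]
    return sum(s for _, s in top) / len(top) if top else 0.0

scores = [[5, 3, 0],        # user 0 has not scored item 2
          [4, 3, 4],
          [5, 3, 5]]
pred = predict_score(scores, user=0, item=2)
print(pred)
```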

Rating: 1.0 / 5 (1 rating) | Versions: 1 | Reviews: 0 | Comments: 1 | Citations: 0

Viewed: 164 times | Downloaded: 80 times

Tags (5):

Original Uploader

Workflow Content based recommender (1)

Created: 15/03/11 @ 15:24:43 | Last updated: 15/03/11 @ 15:29:48

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process is a special case of the item to item similarity matrix based recommender where the item to item similarity is calculated as cosine similarity over TF-IDF word vectors obtained from the textual analysis over all the available textual data. The inputs to the process are context defined macros: %{id} defines an item ID for which we would like to obtain recommendation and %{recommender_no} defines the required number of recommendations. The process internally uses an example set of...
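A drastically simplified version of this content-based ranking can be sketched in Python, with raw term counts standing in for the TF-IDF word vectors the workflow actually computes. All names and the toy corpus are invented:

```python
import math
from collections import Counter

def cosine(c1, c2):
    """Cosine similarity between two term-count vectors (Counters)."""
    num = sum(c1[w] * c2[w] for w in set(c1) & set(c2))
    den = (math.sqrt(sum(v * v for v in c1.values()))
           * math.sqrt(sum(v * v for v in c2.values())))
    return num / den if den else 0.0

def recommend(items, query_id, n):
    """Rank items by textual similarity to the query item's description."""
    vecs = {i: Counter(text.lower().split()) for i, text in items.items()}
    scored = [(cosine(vecs[query_id], vecs[i]), i)
              for i in items if i != query_id]
    scored.sort(reverse=True)
    return [i for _, i in scored[:n]]

items = {1: "machine learning lecture",
         2: "deep learning lecture",
         3: "cooking pasta recipe"}
top = recommend(items, query_id=1, n=2)
print(top)
```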

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 172 times | Downloaded: 74 times

Tags (8):

Original Uploader

Workflow Item to item similarity matrix -based reco... (1)

Created: 15/03/11 @ 15:23:54 | Last updated: 15/03/11 @ 15:30:08

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process executes the recommendation based on item to item similarity matrix. The inputs to the process are context defined macros: %{id} defines an item ID for which we would like to obtain recommendation and %{recommender_no} defines the required number of recommendations. The process internally uses an item to item similarity matrix written in pairwise form (id1, id2, similarity). The process essentially filters out appearances of the required ID in both of the columns of the pairwis...
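The filtering step described above — looking up the query ID in both columns of the pairwise (id1, id2, similarity) form and taking the most similar partners — can be sketched in Python. The `recommend_from_pairs` helper and the toy pairs are invented:

```python
def recommend_from_pairs(pairs, query_id, n):
    """pairs: (id1, id2, similarity) rows. Return the n items most
    similar to query_id, checking both columns of the pairwise form."""
    scored = []
    for id1, id2, sim in pairs:
        if id1 == query_id:
            scored.append((sim, id2))
        elif id2 == query_id:
            scored.append((sim, id1))
    scored.sort(reverse=True)              # highest similarity first
    return [item for _, item in scored[:n]]

pairs = [(1, 2, 0.9), (3, 1, 0.4), (2, 3, 0.8), (1, 4, 0.6)]
recs = recommend_from_pairs(pairs, query_id=1, n=2)
print(recs)
```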

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 1 | Citations: 0

Viewed: 130 times | Downloaded: 60 times

Tags (5):

Original Uploader

Workflow Content based recommender system template (1)

Created: 05/05/11 @ 21:06:32 | Last updated: 09/05/11 @ 13:40:24

Credits: User Matko Bošnjak User Ninoaf

Attributions: Blob Datasets for the pack: RCOMM2011 recommender systems workflow templates

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

As input, this workflow takes two distinct example sets: a complete set of items with IDs and appropriate textual attributes (the item example set) and a set of IDs of items our user has interacted with (the user example set). Also, a macro %{recommendation_no} is defined in the process context as the required number of output recommendations. The first steps of the workflow preprocess these example sets: select only the textual attributes of the item example set, and set ID roles on both of th...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 1 | Citations: 0

Viewed: 180 times | Downloaded: 84 times

Tags (5):

Original Uploader

Workflow Item-based collaborative filtering recomme... (1)

Created: 05/05/11 @ 21:07:01 | Last updated: 09/05/11 @ 13:43:09

Credits: User Matko Bošnjak User Ninoaf

Attributions: Blob Datasets for the pack: RCOMM2011 recommender systems workflow templates

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

The workflow for item-based collaborative filtering receives a user-item matrix as its input, and the same context-defined macros as the user-based recommender template, namely %{id}, %{recommendation_no}, and %{number_of_neighbors}. Although this process is in theory very similar to the user-based technique, it differs in several processing steps, since we are dealing with an item-user matrix, the transposed user-item example set. The first step of the workflow, after declaring zero values miss...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 151 times | Downloaded: 85 times

Tags (5):

Original Uploader

Workflow User-based collaborative filtering recomme... (1)

Created: 05/05/11 @ 21:06:45 | Last updated: 09/05/11 @ 13:45:33

Credits: User Matko Bošnjak User Ninoaf

Attributions: Blob Datasets for the pack: RCOMM2011 recommender systems workflow templates

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

The workflow for user-based collaborative filtering takes only one example set as input: a user-item matrix, where the attributes denote item IDs and the rows denote users. If a user i has rated an item j with a score s, the matrix will have the value s written in the i-th row and j-th column. In the context of the process we define the ID of the user %{id}, the desired number of recommendations %{recommendation_no}, and the number of neighbors used in ca...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 151 times | Downloaded: 72 times

Tags (5):

Original Uploader

Workflow SVD user-based collaborative filtering rec... (1)

Created: 09/05/11 @ 12:45:05 | Last updated: 09/05/11 @ 13:59:45

Credits: User Ninoaf User Matko Bošnjak

Attributions: Workflow User-based collaborative filtering recommender system template Blob Datasets for the pack: RCOMM2011 recommender systems workflow templates

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow takes a user-item matrix A as input. It calculates the reduced SVD decomposition A_k by taking only the k greatest singular values and the corresponding singular vectors. The workflow calculates recommendations and predictions for a particular user %{id} from matrix A. The %{id} row is taken from the original matrix A and replaced with the %{id} row of the A_k matrix. Predictions are made for user %{id} based on the other users in A_k. Note: This workflow uses the R-script operator with R library ...
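The reduced SVD step — keeping only the k greatest singular values and vectors — can be sketched with NumPy. This shows the linear algebra only, not the R-script operator the workflow actually uses; the toy matrix is invented:

```python
import numpy as np

def rank_k_approximation(A, k):
    """Reduced SVD: keep the k largest singular values/vectors,
    giving the best rank-k approximation A_k of A."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

A = np.array([[5.0, 3.0, 0.0],      # toy user-item score matrix
              [4.0, 0.0, 4.0],
              [1.0, 1.0, 5.0]])
A_k = rank_k_approximation(A, k=2)  # smoothed scores for prediction
print(A_k)
```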

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 183 times | Downloaded: 71 times

Tags (7):

Original Uploader

Workflow LSI content based recommender system template (1)

Created: 06/05/11 @ 20:40:24 | Last updated: 09/05/11 @ 13:59:18

Credits: User Ninoaf User Matko Bošnjak

Attributions: Workflow Content based recommender system template Blob Datasets for the pack: RCOMM2011 recommender systems workflow templates

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow performs LSI text-mining content-based recommendation. We use SVD to capture latent semantics between items and words and to obtain a low-dimensional representation of items. Latent Semantic Indexing (LSI) takes the k greatest singular values and left and right singular vectors to obtain the matrix A_k = U_k * S_k * V_k^T. Items are represented as word vectors in the original space, where each row in matrix A represents the word vector of a particular item. Matrix U_k, on the other hand ...
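The low-dimensional item representation mentioned above — each item's row of U_k scaled by the singular values S_k — can be sketched with NumPy. A sketch of the math only; the toy item-word matrix is invented:

```python
import numpy as np

def lsi_item_vectors(A, k):
    """Rows of A are item word-vectors; return their k-dimensional
    latent representations U_k * S_k from the reduced SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] * s[:k]          # broadcasting scales each column

A = np.array([[1.0, 1.0, 0.0, 0.0],  # toy item-word count matrix
              [1.0, 1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
latent = lsi_item_vectors(A, k=2)    # items in 2-D latent space
print(latent)
```

Cosine similarity between the latent rows then plays the same role as item-to-item similarity in the original space.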

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 200 times | Downloaded: 110 times

Tags (8):

Original Uploader

Workflow Mining Semantic Web data using FastMap - E... (1)

Created: 25/06/11 @ 12:11:37 | Last updated: 25/06/11 @ 13:17:51

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow describes how to learn from Semantic Web data. The input to the workflow is a feature vector built from an RDF resource. The loaded example set is then divided into training and test parts. These sub-example sets are used by the FastMap operators (which encapsulate the FastMap data transformation technique), which process one feature at a time and transform the data into a different space. This transformed data is more meaningful and helps the learner to improve classfica...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 1

Viewed: 71 times | Downloaded: 33 times

Tags (7):

Original Uploader

Workflow Mining Semantic Web data using Corresponde... (1)

Created: 25/06/11 @ 14:10:49 | Last updated: 25/06/11 @ 14:22:30

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow describes how to learn from Semantic Web data using the data transformation algorithm 'Correspondence Analysis'. The input to the workflow is a feature vector built from an RDF resource. The loaded example set is divided into training and test parts. These sub-example sets are used by the Correspondence Analysis operators (which encapsulate the Correspondence Analysis data transformation technique), which process one feature at a time and transform the data into a different...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 1

Viewed: 106 times | Downloaded: 42 times

Tags (7):

Original Uploader

Workflow Mining Semantic Web data using Corresponde... (1)

Created: 25/06/11 @ 14:48:59 | Last updated: 25/06/11 @ 15:01:59

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow shows how an example set can be extracted from an RDF resource using the provided SPARQL query. This example set is then divided into training and test parts. These sub-example sets are used by the Correspondence Analysis operators (which encapsulate the Correspondence Analysis data transformation technique), which process one feature at a time and transform the data into a different space. This transformed data is more meaningful and helps the learner to improve clas...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 1

Viewed: 117 times | Downloaded: 39 times

Tags (7):

Original Uploader

Workflow Basic local features extraction with learner (1)

Created: 08/09/11 @ 22:04:03 | Last updated: 09/09/11 @ 09:00:18

Credits: User Staryvena

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process is a simple example of local feature extraction with model training (using SVM and X-Validation). The inputs are a grayscale source image and an image mask: white areas are true, black areas are false. This workflow needs IMMI, the RapidMiner 5 Image Processing Extension.
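The mask convention described above (white pixels mark positive examples, black pixels negative) can be sketched in Python. The `extract_labeled_pixels` helper and the 2x2 toy images are invented; the IMMI extension itself works on real image objects:

```python
def extract_labeled_pixels(image, mask):
    """Pair each pixel value with a boolean label from the mask:
    white (255) -> True, black (0) -> False."""
    samples = []
    for img_row, mask_row in zip(image, mask):
        for value, m in zip(img_row, mask_row):
            samples.append((value, m == 255))
    return samples

image = [[10, 200], [90, 40]]      # toy grayscale image
mask  = [[255, 0], [255, 0]]       # toy label mask
data = extract_labeled_pixels(image, mask)
print(data)
```

The (value, label) pairs are what a learner such as an SVM would then be trained on.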

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 82 times | Downloaded: 51 times

Tags (4):

Original Uploader

Workflow Basic local features extraction (1)

Created: 08/09/11 @ 21:54:51 | Last updated: 09/09/11 @ 09:00:58

Credits: User Staryvena

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process is a simple example of local feature extraction with no model training. The inputs are a grayscale source image and an image mask: white areas are true, black areas are false. This workflow needs IMMI, the RapidMiner 5 Image Processing Extension.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 73 times | Downloaded: 31 times

Tags (4):

Original Uploader

Workflow Basic local features extraction with proce... (1)

Created: 08/09/11 @ 22:02:19 | Last updated: 09/09/11 @ 08:59:20

Credits: User Staryvena

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process is a simple example of local feature extraction with no model training. The inputs are a grayscale source image and an image mask: white areas are true, black areas are false. The Thresholding operator is used to limit the processed pixels to non-black ones (values greater than 0). This workflow needs IMMI, the RapidMiner 5 Image Processing Extension.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 91 times | Downloaded: 25 times

Tags (4):

Original Uploader

Workflow Image mining performace visualize (1)

Created: 09/09/11 @ 10:28:01 | Last updated: 09/09/11 @ 10:29:20

Credits: User Staryvena

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This process is a simple example of applying a trained model to data mined from an image. The results are performance measures and a probability visualization. This workflow needs IMMI, the RapidMiner 5 Image Processing Extension.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 92 times | Downloaded: 37 times

Tags (4):

Original Uploader

Workflow Tag Clustering (TaCl) (1)

Created: 17/11/11 @ 16:11:14 | Last updated: 17/11/11 @ 16:11:15

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is a sample process for a tag clustering. See http://www-ai.cs.uni-dortmund.de/SOFTWARE/TaCl/index.html

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 34 times | Downloaded: 13 times

Tags (2):

Original Uploader

Workflow Semantic clustering (with AHC) of SPARQL q... (1)

Created: 29/01/12 @ 09:38:32 | Last updated: 29/01/12 @ 09:39:22

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

The workflow uses the RapidMiner extension RMonto (http://semantic.cs.put.poznan.pl/RMonto/) to cluster SPARQL query results based on a chosen semantic similarity measure. The measure used in this particular workflow is a kernel that exploits the membership of the clustered individuals in OWL classes from a background ontology (the "Common classes" kernel from [1]). Since the semantics of the background ontology is used in this way, we use the name "semantic clustering". ...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 63 times | Downloaded: 17 times

Tags (8):

Original Uploader

Workflow Semantic clustering (with k-medoids) of SP... (1)

Created: 29/01/12 @ 09:28:33

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

The workflow uses the RapidMiner extension RMonto (http://semantic.cs.put.poznan.pl/RMonto/) to cluster SPARQL query results based on a chosen semantic similarity measure. Since the semantics of the background ontology is used in this way, we use the name "semantic clustering". The SPARQL query is entered as a parameter of the "SPARQL selector" operator. The clustering operator (k-medoids) allows you to specify which of the query variables are to be used as clustering criteria. If more ...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 34 times | Downloaded: 18 times

Tags (8):

Original Uploader

Workflow Semantic clustering (with alpha-clustering... (1)

Created: 29/01/12 @ 15:52:57 | Last updated: 30/01/12 @ 16:10:25

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

The workflow uses the RapidMiner extension RMonto (http://semantic.cs.put.poznan.pl/RMonto/) to cluster SPARQL query results based on a chosen semantic similarity measure. The measure used in this particular workflow is a kernel that exploits the membership of the clustered individuals in OWL classes from a background ontology (the "Epistemic" kernel from [1]). Since the semantics of the background ontology is used in this way, we use the name "semantic clustering". This ...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 31 times | Downloaded: 9 times

Tags (8):

Original Uploader

Workflow Loading OWL files (RDF version of videolec... (1)

Created: 29/01/12 @ 16:11:36 | Last updated: 29/01/12 @ 16:47:06

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

The workflow uses the RapidMiner extension RMonto (http://semantic.cs.put.poznan.pl/RMonto/). The "Build knowledge base" operator is responsible for collecting data from OWL files, SPARQL endpoints or RDF repositories and providing it to the subsequent operators in a workflow. In this workflow it is parameterized so that it builds a Sesame/OWLIM repository from the files specified in the "Load file" operators. Paths to OWL files are specified as parameter va...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 45 times | Downloaded: 12 times

Tags (8):

Original Uploader

Workflow Operator testing workflow (1)

Created: 29/01/12 @ 11:08:43

Credits: User Matej Mihelčić User Matko Bošnjak

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow is used for operator testing. It joins dataset metafeatures with the execution times and performance measures of the selected recommendation operator. In the Extract train and Extract test Execute Process operators, the user should open the Metafeature extraction workflow. In the Loop operator, train/test data are used to evaluate the performance of the selected operator. The result is remembered and joined with the timing and metafeature information. This workflow can be used both for Item Recommend...

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 25 times | Downloaded: 7 times

Tags (4):

Original Uploader

Workflow Metafeature extraction (1)

Created: 29/01/12 @ 11:00:27 | Last updated: 30/01/12 @ 13:11:27

Credits: User Matko Bošnjak

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is the metafeature extraction workflow used in the experimentation workflow for the Recommender extension operators. It extracts metadata from the train/test datasets (user/item counts, rating count, sparsity, etc.). This workflow is called from the operator testing workflow using the Execute Process operator.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 28 times | Downloaded: 8 times

Tags (4):

Original Uploader

Workflow Model update workflow (1)

Created: 29/01/12 @ 16:51:32 | Last updated: 29/01/12 @ 16:54:22

Credits: User Matej Mihelčić User Matko Bošnjak

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is the model update workflow, called from the data iteration workflow on every given query set. In the Loop operator, the model and the current training set are retrieved from the repository. A model update is performed on the given query set, creating a new model. The model and the updated train set are saved in the repository.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 15 times | Downloaded: 9 times

Tags (3):

Original Uploader

Workflow Data iteration workflow (1)

Created: 29/01/12 @ 16:47:38

Credits: User Matej Mihelčić User Matko Bošnjak

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is a data iteration workflow used to iterate through the query update sets.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 23 times | Downloaded: 9 times

Tags (3):

Original Uploader

Workflow Iterate through datasets (1)

Created: 29/01/12 @ 10:54:05

Credits: User Matej Mihelčić User Matko Bošnjak

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is a dataset iteration workflow. It is part of the experimentation workflow for the Recommender extension. The Loop Files operator iterates through the datasets in a specified directory using the Read AML operator. Only datasets matching a proper regular expression are considered. Train and test data filenames must correspond, e.g. (train1.aml, test1.aml). In each iteration, Loop Files calls the specified operator testing workflow with the Execute subprocess operator. Information about training and t...
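The filename pairing convention above (train1.aml matched with test1.aml, and so on) can be sketched in Python. The `pair_datasets` helper is hypothetical; inside RapidMiner, Loop Files does this matching by regular expression:

```python
import re

def pair_datasets(filenames):
    """Match trainN.aml with the corresponding testN.aml by the shared index N."""
    train = {m.group(1): f for f in filenames
             if (m := re.fullmatch(r"train(\d+)\.aml", f))}
    test = {m.group(1): f for f in filenames
            if (m := re.fullmatch(r"test(\d+)\.aml", f))}
    # keep only indices present in both sets, in sorted order
    return [(train[i], test[i]) for i in sorted(train) if i in test]

files = ["train1.aml", "test1.aml", "train2.aml", "test2.aml", "notes.txt"]
pairs = pair_datasets(files)
print(pairs)
```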

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 54 times | Downloaded: 53 times

Tags (4):

Original Uploader

Workflow Model testing workflow (1)

Created: 29/01/12 @ 16:45:55

Credits: User Matej Mihelčić

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow measures the performance of three models: a model learned on the train data and upgraded using online model updates, a model learned on the train data plus all query update sets, and a model learned on the train data only.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 31 times | Downloaded: 10 times

Tags (3):

Original Uploader

Workflow Model saving workflow (1)

Created: 29/01/12 @ 16:41:36 | Last updated: 30/01/12 @ 13:15:44

Credits: User Matej Mihelčić

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow trains and saves a model for a selected item recommendation operator.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 11 times | Downloaded: 5 times

Tags (3):

Original Uploader

Workflow Recommender workflow (1)

Created: 29/01/12 @ 16:38:56

Credits: User Matej Mihelčić

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is the main online update experimentation workflow. It consists of three Execute Process operators. The first operator executes the model training workflow. The second executes the online updates workflow for multiple query update sets. The last executes the performance testing and comparison workflow. The final performance results are saved in an Excel file.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 37 times | Downloaded: 10 times

Tags (3):

Original Uploader

Workflow Data iteration workflow (RP) (1)

Created: 29/01/12 @ 22:04:45

Credits: User Matej Mihelčić User Matko Bošnjak

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is a data iteration workflow used to iterate through the query update sets.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 20 times | Downloaded: 9 times

Tags (4):

Original Uploader

Workflow Model update workflow (RP) (1)

Created: 29/01/12 @ 22:02:06 | Last updated: 30/01/12 @ 13:01:22

Credits: User Matej Mihelčić

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is the model update workflow, called from the data iteration workflow on every given query set. In the Loop operator, the model and the current training set are retrieved from the repository. A model update is performed on the given query set, creating a new model. The model and the updated train set are saved in the repository.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 16 times | Downloaded: 6 times

Tags (3):

Original Uploader

Workflow recommender workflow (RP) (1)

Created: 29/01/12 @ 21:58:28

Credits: User Matej Mihelčić

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This is the main online update experimentation workflow. It consists of three Execute Process operators: the first executes the model training workflow, the second executes the online updates workflow over multiple query update sets, and the last executes the performance testing and comparison workflow. The final performance results are saved to an Excel file.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 29 times | Downloaded: 13 times

Tags (3):

Original Uploader

Workflow Model testing workflow (RP) (1)

Created: 29/01/12 @ 21:57:38 | Last updated: 30/01/12 @ 13:01:53

Credits: User Matej Mihelčić

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow measures the performance of three models: a model learned on the training data and upgraded using online model updates, a model learned on the training data plus all query update sets, and a model learned on the training data only.
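The three-way comparison can be sketched with a toy mean predictor; all names here are illustrative, not RapidMiner API. Note that for a mean predictor the online-updated model and the fully retrained model coincide exactly, whereas with real learners online updates usually only approximate a full retrain.

```python
# Compare the three models the workflow evaluates:
# online-updated, retrained on train + all query sets, and train-only.

def mean_model(data):
    return sum(data) / len(data)

def rmse(pred, truth):
    return (sum((pred - t) ** 2 for t in truth) / len(truth)) ** 0.5

train = [3.0, 4.0, 5.0]
updates = [[4.0], [4.0]]          # the query update sets
test_set = [4.0, 5.0]

baseline = mean_model(train)                       # train data only
retrained = mean_model(train + sum(updates, []))   # train + all query sets

# online: fold each query update set into a running mean incrementally
s, n = sum(train), len(train)
for batch in updates:
    s, n = s + sum(batch), n + len(batch)
online = s / n

print(online == retrained)                 # exact match for a mean predictor
print(round(rmse(baseline, test_set), 3))
```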

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 34 times | Downloaded: 4 times

Tags (4):

Original Uploader

Workflow Model saving workflow (RP) (1)

Created: 29/01/12 @ 21:56:33 | Last updated: 30/01/12 @ 13:03:04

Credits: User Matej Mihelčić

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow trains and saves a model for a selected rating prediction operator.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 17 times | Downloaded: 3 times

Tags (4):

Original Uploader

Workflow Transforming user/item description dataset... (1)

Created: 30/01/12 @ 13:54:07

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

This workflow transforms a user/item description attribute set into the format required by the attribute-based k-NN operators of the Recommender extension. See http://zel.irb.hr/wiki/lib/exe/fetch.php?media=del:projects:elico:recsys_manual_v1.1.pdf to learn about the dataset formats required by the Recommender extension.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 30 times | Downloaded: 24 times

Tags (4):

Original Uploader

Workflow Semantic meta-mining workflow that perform... (1)

Created: 05/03/12 @ 21:37:20 | Last updated: 05/03/12 @ 21:37:21

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

Performs cross-validation on a data set composed of meta-data of baseline RapidMiner workflows, expressed in RDF with the DMOP ontology's terminology for representing processes. Includes discovery of a set of semantic features ('workflow patterns') by the Fr-ONT-Qu algorithm. Through a propositionalisation approach, those features may be used in an arbitrary (propositional) RapidMiner classification operator.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 29 times | Downloaded: 9 times

Tags (5):

Original Uploader

Workflow Meta-mining workflow that performs crossva... (1)

Created: 05/03/12 @ 21:47:41 | Last updated: 05/03/12 @ 21:58:10

License: Creative Commons Attribution-No Derivative Works 3.0 Unported License

Performs cross-validation on a data set composed of baseline RapidMiner workflows, each described by the characteristics of the dataset it uses and the learning algorithm it applies.

Rating: 0.0 / 5 (0 ratings) | Versions: 1 | Reviews: 0 | Comments: 0 | Citations: 0

Viewed: 29 times | Downloaded: 14 times

Tags (4):

Packs (9)
Creator

Pack RapidMiner plugin for Taverna videos and descriptions


Created: 06/06/11 @ 10:17:52 | Last updated: 13/12/11 @ 16:02:04

This pack contains videos that show how to use various parts of the RapidMiner plugin for Taverna. The videos demonstrate how to build a Taverna workflow that collects a GEO dataset, uploads it to RapidAnalytics, trains a classifier on one half of the data, and tests it on the other half. This classification process can be used to gauge how well mutant and control assays agree across experimental repeats.

3 items in this pack

Comments: 0 | Viewed: 47 times | Downloaded: 26 times

Tags:

Creator

Pack Creating a focused corpus of factual outcomes from b...


Created: 28/06/11 @ 11:19:04 | Last updated: 13/12/11 @ 16:02:16

This pack contains resources and supplementary files for the submission to the MIND2011 workshop titled "Creating a focused corpus of factual outcomes from biomedical experiments" by James Eales, George Demetriou and Robert Stevens.

0 items in this pack

Comments: 0 | Viewed: 20 times | Downloaded: 11 times

Tags:

Creator

Pack RapidAnalytics Video Series Demo Processes


Created: 02/11/11 @ 15:02:21 | Last updated: 02/11/11 @ 18:00:41

This pack contains RapidMiner processes created for the RapidAnalytics Video Series.

1 item in this pack

Comments: 0 | Viewed: 52 times | Downloaded: 20 times

Tags:

Creator

Pack Who Wants to be a Data Miner?


Created: 02/11/11 @ 17:54:07 | Last updated: 04/09/12 @ 16:59:23

One of the most fun events at the annual RapidMiner Community Meeting and Conference (RCOMM) is the live data mining process design competition "Who Wants to be a Data Miner?". In this competition, participants must design RapidMiner processes for a given goal within a few minutes. The tasks are related to data mining and data analysis, but are rather uncommon. In fact, most of the challenges ask for things RapidMiner was never supposed to do. This pack contains solutions for these...

2 items in this pack

Comments: 0 | Viewed: 77 times | Downloaded: 10 times

Tags:

Creator

Pack e-LICO recommender workflows


Created: 15/03/11 @ 15:33:48 | Last updated: 28/01/12 @ 19:39:06

This pack contains recommender system workflows created for the e-LICO project.

0 items in this pack

Comments: 0 | Viewed: 98 times | Downloaded: 36 times

Tags:

Creator

Pack RCOMM2011 recommender systems workflow templates


Created: 07/04/11 @ 14:59:37 | Last updated: 28/01/12 @ 19:37:47

No description

0 items in this pack

Comments: 0 | Viewed: 188 times | Downloaded: 55 times

Tags:

Pack Online update experiment pack


Created: 29/01/12 @ 16:29:09 | Last updated: 29/01/12 @ 22:06:46

This pack contains experimentation workflows and datasets for online update testing of item recommendation and rating prediction.

0 items in this pack

Comments: 0 | Viewed: 23 times | Downloaded: 12 times

Tags:

Pack Experimentation for recommender extension templates


Created: 28/01/12 @ 21:54:16 | Last updated: 31/01/12 @ 16:01:43

This is a Recommender extension experimentation pack.

0 items in this pack

Comments: 0 | Viewed: 21 times | Downloaded: 14 times

Tags:

Creator

Pack Sudoku solving with RapidMiner (Who Wants to be a Da...


Created: 04/09/12 @ 16:56:44 | Last updated: 04/09/12 @ 16:58:26

A fun event at the annual RapidMiner conference RCOMM is the live data mining challenge "Who wants to be a data miner?", in which contestants solve data analysis tasks within a few minutes. In 2012 the task was to (partially) solve a Sudoku puzzle. Processes 1 to 3 in this pack correspond to the three tasks, whereas process 0 loads the initial data and process 4 is a bonus process that solves the entire Sudoku. Make sure the processes are saved under the name they have on myExperimen...

0 items in this pack

Comments: 0 | Viewed: 8 times | Downloaded: 15 times

Tags:


Linked Data

Non-Information Resource URI: http://alpha.myexperiment.org/tags/1764

