In this example, we illustrate the implementation and composition of two services into a decentralized application with Aqua. Specifically, we use a hosted greeting, aka *hello world*, service as a consumer of the output of another hosted service. For the purposes of this example, let's call our upstream service *echo-service*; it simply echoes its inputs. *Echo-service* can be viewed as a placeholder for, say, a database or formatting service.
To run the example in its entirety, you need to install a few tools. See [Setting Up](https://doc.fluence.dev/docs/tutorials_tutorials/recipes_setting_up) for details. For more developer resources see the [Developer Docs](https://doc.fluence.dev/docs/), [Aqua Book](https://doc.fluence.dev/aqua-book/) and the [Marine Examples](./../../marine-examples).
Services are logical constructs composed of Wasm Interface Types (IT) modules executing on the [Marine](https://github.com/fluencelabs/marine) runtime available on each [Fluence node](https://github.com/fluencelabs/fluence). At this time, Rust is not only the preferred but also the only supported language for writing Wasm modules. For the examples at hand, we need to develop and deploy two services: a greeting service and an echo service, where the echo service returns the inputs for the greeting service.
Our [greeting service](./greeting/src/main.rs) is very simple: it takes a name value to return and a boolean value to determine whether our greeting to `name` is *Hi* or *Bye*. As shown below, the code is basic Rust plus the `marine` macro, which makes sure our code is valid Wasm IT code that can be compiled to our desired `wasm32-wasi` compile target.
We can compile our code with the provided build script:
```text
% ./scripts/build_all.sh
```
The build script compiles each of the specified services with the Marine compiler and generates two Wasm modules, which are placed in the `artifacts` directory. Before we deploy the service, we can inspect and test each module with the Marine REPL and the `configs/Config.toml` file, which contains module metadata such as location and name.
```text
% mrepl configs/Config.toml
Welcome to the Marine REPL (version 0.8.0)
Minimal supported versions
sdk: 0.6.0
interface-types: 0.20.0
app service was created with service id = d5974dab-d7dc-4168-9b47-1d9a647a6fa8
```
Looks like all is working as planned and we're ready to deploy our services to the Fluence testnet. To deploy a service, we need the peer id of our desired host node, which we can get with `fldist env`:
Any one of the peers will do and we can deploy our services with the `fldist` tool by providing the peer id of the host node and the location of the Wasm module(s) and configuration file defining the service.
## Building A Decentralized Greeting Application With Aqua
We're ready to build our application with Aqua as our composition medium from the greeting and echo services. Writing Aqua scripts requires the specification of each service's public API, and Marine offers us a convenient way to export Aqua-compatible interface definitions:
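For instance, exporting the greeting module's interface with `marine aqua artifacts/greeting.wasm` might produce something along the lines of the following service declaration. This is a sketch: the exact service name, function name, and parameters depend on the Rust code discussed earlier.

```aqua
-- hypothetical `marine aqua` export for the greeting module
service Greeting:
    greeting(name: string, greet: bool) -> string
```

The echo module's interface can be exported the same way.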
Of course, we can pipe the `marine aqua` interfaces into an Aqua file of our choice, e.g. `marine aqua artifacts/greeting.wasm >> aqua-scripts/my_aqua.aqua`, to get things started. Before we dive into Aqua development, let's compile the already created Aqua program `aqua-scripts/echo_greeter.aqua` with `aqua`:
Since we compile with the `-a` flag, we generate Aqua intermediate representation (AIR) files, which are located in the `air-scripts` directory. Further below, we'll see how to generate ready-to-use TypeScript stubs with the Aqua compiler.
To make things copacetic for the remainder of this section, we'll be using services already deployed to the Fluence testnet:
Below is the first attempt at using Aqua to compose our two services into the desired application workflow: the execution of a greeting service for each output provided by the upstream echo service.
The first section of the Aqua file comprises the public interfaces exposed by the underlying Wasm services, which we obtained earlier. The composition of the services into our application happens in the `echo_greeting_seq` function. Before we run through the function body, let's have a look at the function signature:
Our first two argument slots in `echo_greeting_seq` take care of that. Aside from the actual Wasm function inputs, we also need to provide information with respect to the location and identity of the services we want to utilize. In this instance, we provide service ids for both the echo and greeting service, respectively, and a single peer id. This indicates that both services are hosted on the same node, which is possible but neither necessary nor even desirable.
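Under these assumptions, the sequential composition might be sketched as follows. The service and parameter names are illustrative, and `echo` is assumed to return its inputs as an array of strings:

```aqua
-- hedged sketch: both services hosted on the same node
func echo_greeting_seq(greet: bool, node: string, echo_service_id: string, greeting_service_id: string, names: []string) -> []string:
    res: *string
    on node:
        -- resolve the service ids on the host node
        EchoService echo_service_id
        GreetingService greeting_service_id
        echo_results <- EchoService.echo(names)
        -- feed each echoed name into the greeting service, one after another
        for result <- echo_results:
            res <- GreetingService.greeting(result, greet)
    <- res
```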
`fldist` provides a client peer and submits the compiled Aqua script, specified with the `-p` flag, and the input data, specified with the `-d` flag, as a JSON string to the peer-to-peer network for execution and returns the expected result:
Of course, services need not be deployed to the same node and with some minor adjustments to our Aqua function signature and body, we can accommodate multi-host scenarios rather easily. We also added the `NodeServicePair` structure to make the function signature more compact:
```aqua
-- aqua-scripts/echo_greeter.aqua
-- struct for node, service tuple
data NodeServicePair:
    node: string
    service_id: string

-- revised Aqua function to accommodate (node, service) separation
```
Since we want to compose services deployed on different nodes, we express this requirement by specifying the (node, service) tuples via `on echo_topo.node` and `on greeting_topo.node` in sequence. That is, the workflow first calls the echo-service, followed by three sequential calls on the greeting service.
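Using the `NodeServicePair` struct, the revised function might look roughly like the following. Again, this is a sketch under the naming assumptions above rather than the exact implementation:

```aqua
-- hedged sketch: echo and greeting services on different nodes
func echo_greeting_seq_2(greet: bool, echo_topo: NodeServicePair, greeting_topo: NodeServicePair, names: []string) -> []string:
    res: *string
    on echo_topo.node:
        EchoService echo_topo.service_id
        echo_results <- EchoService.echo(names)
    -- the particle moves to the greeting host before the greeting calls run
    on greeting_topo.node:
        GreetingService greeting_topo.service_id
        for result <- echo_results:
            res <- GreetingService.greeting(result, greet)
    <- res
```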
Both workflow examples we've seen so far execute their service calls **seq**uentially. Let's kick it up a notch and process echo-service outputs in **par**allel. Of course, we need to have the necessary greeting services deployed on different peers; otherwise, parallel processing defaults to sequential processing. Also, to continue to keep things compact, we introduce the `EchoServiceInput` struct.
In this version of the implementation, we call the echo-service just as before and introduce parallelization when we reach the greeting-service fold. That is, for each *result*, we execute the k greeting services specified in the `greeting_services` array in parallel. Note that as a consequence of the parallelization, we need to introduce a `join` on *res*, since the results stream into *res* on the specified node and are therefore not visible to the other parallel arms. We accomplish this with the `OpString.identity(res!5)` call, where the argument needs to be a literal at this point.
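Putting the pieces together, the parallel variant might be sketched as follows. The `EchoServiceInput` struct and the `OpString` identity service, bound to the builtin `op` service, are assumed to be declared as shown; the literal index in `res!5` waits for the sixth result (three names times two greeting services):

```aqua
-- assumed declarations for this sketch
data EchoServiceInput:
    node: string
    service_id: string

service OpString("op"):
    identity(s: string) -> string

-- hedged sketch: greeting calls fan out in parallel per echo result
func echo_greeting_par(greet: bool, echo_service: EchoServiceInput, greeting_services: []NodeServicePair, names: []string) -> []string:
    res: *string
    on echo_service.node:
        EchoService echo_service.service_id
        echo_results <- EchoService.echo(names)
    for result <- echo_results par:
        for greeting_topo <- greeting_services:
            on greeting_topo.node:
                GreetingService greeting_topo.service_id
                res <- GreetingService.greeting(result, greet)
    -- join: block until the sixth stream element has arrived
    OpString.identity(res!5)
    <- res
```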
Since we have three input names and two greeting services, we expect, and get, six results, with the parallelization over each echo-service result. Of course, we can change the point of parallelization to cover the echo-service results array for each provided service. Our updated Aqua composition function now reads:
With some additional modifications to our Aqua function, we can further improve readability by supplying the *greet* parameter for each service. Let's add a `GreetingServiceInput` struct and update the function signatures and bodies:
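A sketch of that change follows, with the caveat that the struct fields and function name are again illustrative; the `OpString` join service is redeclared here to keep the sketch self-contained:

```aqua
-- hedged sketch: per-service greet flag
data GreetingServiceInput:
    node: string
    service_id: string
    greet: bool

service OpString("op"):
    identity(s: string) -> string

func echo_greeting_par_alternative(echo_service: EchoServiceInput, greeting_services: []GreetingServiceInput, names: []string) -> []string:
    res: *string
    on echo_service.node:
        EchoService echo_service.service_id
        echo_results <- EchoService.echo(names)
    -- parallelize over the services; each arm walks the echo results sequentially
    for greeting_service <- greeting_services par:
        on greeting_service.node:
            GreetingService greeting_service.service_id
            for result <- echo_results:
                res <- GreetingService.greeting(result, greeting_service.greet)
    -- join on the last expected stream element
    OpString.identity(res!5)
    <- res
```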
In this section, we explored how we can use Aqua to program hosted services into applications. Along the way, we investigated sequential and parallel workflows and discovered that changes in processing or workflow logic are handled at the Aqua level, without requiring any changes to the deployed services. Throughout our experimentation with Aqua and deployed services, we used the `fldist` tool as our local CLI client peer. In the next section, we introduce the development and use of a TypeScript client peer.
In the previous section, we used `fldist` as our local client peer to execute our compiled Aqua scripts on the network. Alternatively, Aqua code can be compiled directly to TypeScript utilizing the Fluence [JS-SDK](https://github.com/fluencelabs/fluence-js).
The resulting auto-generated file, `echo_greeter.ts`, was copied to the `src` directory. The Aqua compiler generates TypeScript functions corresponding to each of the Aqua functions we implemented. All we have to do is use them!