Compare commits

...

65 Commits

Author SHA1 Message Date
Bram 086e491b06 Ignore decoder logs
Specifically on Hashdict/Mashdict as they vary based on JSON input
2024-04-12 14:30:33 +02:00
Bram 0ef298a28e Connect User to Event type 2024-04-12 13:57:38 +02:00
Bram 259f695b74 Expose Matrix.User module 2024-04-12 13:15:26 +02:00
Bram ae19884a18 Correct IPv6 Parser 2024-04-12 13:14:50 +02:00
Bram b3e103a5d9 Write Grammar tests 2024-04-10 18:14:32 +02:00
Bram 203205f53c Expose Elm types
As a general rule of thumb, internal values need no opaque types to hide their implementation
2024-04-10 08:28:52 +02:00
Bram c858ef151e Merge develop into parser branch 2024-04-09 20:46:10 +02:00
Bram 78794ef59b Add exposed modules + elm-format 2024-04-09 20:43:18 +02:00
Bram cd0c687307 Add server name module 2024-04-09 20:38:35 +02:00
Bram 077605bcbc Remove old file locations 2024-04-09 20:37:01 +02:00
BramvdnHeuvel 709d608056
Merge pull request #17 from noordstar/3-timeline
Add Timeline as data type
2024-04-08 14:21:04 +02:00
Bram f1dde4874b Merge Text code 2024-03-29 12:01:05 +01:00
Bram 5856084b45 Remove unused RationalOrder module
Kill your darlings, I guess. :(
2024-03-29 11:53:24 +01:00
Bram db6573180b elm-format 2024-03-29 11:52:12 +01:00
Bram 3983ab0de2 Merge develop into 3-timeline
Fixing merge conflict
2024-03-29 11:29:48 +01:00
Bram ccefa2ed9b Add documentation 2024-03-29 11:28:04 +01:00
Bram d41c31e8c1 Allow Timeline fuzzers to simulate /sync 2024-03-29 11:18:01 +01:00
Bram 792e60761a Add addSync to Timeline 2024-03-29 11:08:42 +01:00
Bram 33d98dd6ff Improve code readability
Cleaning up unused imports + changing order of operations on a value that is 0 by default
2024-03-29 10:51:10 +01:00
Bram ed78695213 Add Timeline JSON coder 2024-03-29 10:46:02 +01:00
Bram 791ada6f64 Merge branch 'parser' of github.com:noordstar/elm-matrix-sdk-beta into parser 2024-03-29 07:18:53 +01:00
Bram 56c978bcf3 Add custom text 2024-03-29 07:16:36 +01:00
Bram d1336a0e23 Add server name + user id parsers 2024-03-29 07:15:27 +01:00
Bram d68de7f2fb Add parser helper functions 2024-03-29 07:13:06 +01:00
Bram 43f0ac5ef2 Add server name specified fuzzers 2024-03-25 10:07:47 +01:00
Bram 421e1f6ce7 Add Elm Timeline to elm.json 2024-03-25 08:52:55 +01:00
Bram 1940b1d51f Fix RationalOrder formatting 2024-02-15 13:20:25 +01:00
Bram 7acae258ed Complete Timeline by fixing bugs 2024-02-15 13:15:19 +01:00
Bram 10c7075bef Merge branch '3-timeline' of github.com:noordstar/elm-matrix-sdk-beta into 3-timeline 2024-02-15 11:34:25 +01:00
Bram van den Heuvel cf28a3f210 Implement `mostRecentEvents` function 2024-02-15 01:27:00 +01:00
Bram van den Heuvel 2d26e1826d Add challenge 2 2024-02-13 11:13:16 +01:00
Bram van den Heuvel 6134702d25 Add Timeline documentation 2024-02-12 18:54:58 +01:00
Bram cd8163bb41 Fix bugs 2024-02-02 12:57:38 +01:00
Bram bbe1eeef12 Prepare develop for master
elm-test --fuzz 10000 --seed 49678983951728
2024-02-02 12:15:50 +01:00
Bram 016290d9e1 Merge branch 'develop' into 3-timeline 2024-01-29 22:09:59 +01:00
BramvdnHeuvel 16a7e7e66b
Merge pull request #15 from noordstar/json-extra
Json reintroduction
2024-01-29 21:52:31 +01:00
Bram d40af28b38 Add RationalOrder type 2024-01-29 21:42:33 +01:00
Bram van den Heuvel ead65c07f5 Write object field documentation 2024-01-25 14:47:54 +01:00
Bram b6e181237f Merge branch 'develop' into 3-timeline 2024-01-23 19:20:44 +01:00
Bram c3c08845d8 Merge branch 'develop' into json-extra 2024-01-23 13:43:05 +01:00
Bram acd13ac67a Complete documentation 2024-01-22 18:09:08 +01:00
Bram dd5f298fd3 Migrate remaining objects to new JSON coders 2024-01-22 17:44:22 +01:00
Bram d1fbc87730 Refactor to new JSON coders 2024-01-19 16:22:51 +01:00
Bram 28d2a17a10 Add final features to JSON coder module 2024-01-19 16:21:41 +01:00
Bram 3f08e4a3e7 Add more high-level ary humans 2024-01-18 14:46:50 +01:00
Bram ecdc136f9e Write JSON test module 2024-01-16 16:06:38 +01:00
Bram 21dfa1e77f Fix documentation standards 2024-01-16 16:05:34 +01:00
Bram van den Heuvel 6f42916a19 Add advanced JSON module 2024-01-16 14:13:27 +01:00
BramvdnHeuvel ef9b007730
Merge origin/develop-to-main to origin/develop
Develop to main for beta 2.1.1
2024-01-12 17:22:32 +01:00
Bram van den Heuvel 311de94c1f Bump to beta 2.1.1 2024-01-12 17:18:01 +01:00
Bram van den Heuvel 462aa3a2dc Fix test 2024-01-12 17:15:07 +01:00
BramvdnHeuvel cecf9c1f77
Merge pull request #11 from noordstar/settings-patch
Update syncTime to be at least 1
2024-01-12 17:12:24 +01:00
Bram van den Heuvel ad3f45d035 Update syncTime to be at least 1 2024-01-12 17:11:07 +01:00
Bram 2f7a247dbd Rewrite Timeline tests for comparing timelines through JSON 2024-01-12 00:08:35 +01:00
Bram 394799da8b Add Timeline JSON coders 2024-01-12 00:08:06 +01:00
Bram van den Heuvel 29f6a5e754 Refactor Batch input 2024-01-11 22:35:14 +01:00
Bram van den Heuvel 79aff7af3b Refactor Timeline architecture 2024-01-11 09:20:16 +01:00
Bram van den Heuvel fd569aa476 Implement complex JSON decoder 2024-01-10 09:22:12 +01:00
Bram van den Heuvel ae38fe6878 Introduce phantom file 2024-01-10 00:59:22 +01:00
Bram van den Heuvel 06c048286c Add log module 2024-01-10 00:59:09 +01:00
Bram van den Heuvel 2d01802b86 Rename Tools.Encode and Tools.Decode to Extra 2024-01-09 18:35:21 +01:00
Bram van den Heuvel 81b0b1c166 Add first (faulty) design of Timeline 2024-01-09 18:19:29 +01:00
Bram 211f8f1df4 Add Timeline framework
The tests fail, but using test-driven development we will now build them functional
2024-01-05 13:51:06 +01:00
Bram e8ee125def Add subsetOf Filter function 2024-01-04 03:13:17 +01:00
Bram 3739043f87 Create Timeline filter 2024-01-04 02:00:31 +01:00
45 changed files with 6209 additions and 401 deletions

138
docs/timeline.md Normal file
View File

@ -0,0 +1,138 @@
# Timeline
Given the complex nature of the Timeline design, it deserves some explanation of
the design. This document aims to describe how the Elm SDK designs the Timeline,
so that other projects may learn from it.
## API endpoint disambiguations
Generally speaking, there are a few API endpoints with similar design:
- The [`/sync` endpoint](https://spec.matrix.org/v1.9/client-server-api/#get_matrixclientv3sync),
which gets the events that the homeserver received most recently.
- The [`/messages` endpoint](https://spec.matrix.org/v1.9/client-server-api/#get_matrixclientv3roomsroomidmessages),
which gets events in topological order.
As noted in the Matrix spec:
> Events are ordered in this API according to the arrival time of the event on
> the homeserver. This can conflict with other APIs which order events based on
> their partial ordering in the event graph. This can result in duplicate events
> being received (once per distinct API called). Clients SHOULD de-duplicate
> events based on the event ID when this happens.
For this reason, the Elm SDK maintains **two independent timelines** that are tied
together when necessary to form a coherent timeline.
## Elm design
For those unfamiliar, the Elm Architecture breaks into three parts:
- **Model** - the state of the application
- **View** - a way to turn your state into meaningful information
- **Update** - a way to update your state based on the Matrix API
Since these concepts are compartmentalized, it is impossible to make an API call
while executing the **view** function; the Elm SDK must at all times find a way
to represent its state.
## Timeline
For the Matrix timeline, this means creating a representation (**Model**) of the
timeline, finding a way to represent (**View**) it, and finding a simple way to
adjust it with every incoming Matrix API result (**Update**).
First, we define what a timeline batch is.
### Timeline batch
A timeline batch is something that most Matrix API endpoints return. It is a
little piece of the timeline and contains the following four pieces of
information:
1. A list of events that are part of the timeline.
2. A Filter for which all provided events meet the criteria.
3. An end batch token that functions as an identifier.
4. _(Optional.)_ A start token. If not provided, it indicates the start of the
timeline.
Here's an example of such a timeline batch:
```
|-->[■]->[■]->[●]->[■]->[■]->[●]-->|
|                                  |
|<---   filter: only ■ and ●   --->|
|                                  |
start:                          end:
<token_1>                  <token_2>
```
When the Matrix API later returns a batch token that starts with `<token_2>`,
we know that we can connect it to the batch above and make a longer list of
events!
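As a rough sketch, such a batch can be thought of as the following record; the field names mirror the internal `IBatch` fields documented elsewhere in this changeset, but this is an illustration, not the SDK's actual definition:
```elm
import Internal.Filter.Timeline exposing (Filter)

-- Illustrative sketch of a timeline batch, not the SDK's internal type.
type alias Batch =
    { events : List String -- event IDs that are part of this piece of the timeline
    , filter : Filter -- filter that every listed event satisfies
    , start : Maybe String -- batch token; Nothing marks the start of the timeline
    , end : String -- batch token that identifies where this batch ends
    }
```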
At first, this seems quite simple to connect, but there are some difficulties
that come up along the way.
### Challenge 1: different filters, different locations
When two timeline batches have different filters, we do not know their
respective locations. For example, the following two timeline batches COULD
overlap, but it is also possible they don't:
```
|-->[■]->[■]->[●]->[■]->[■]->[●]-->|
|                                  |
|<---   filter: only ■ and ●   --->|
|                                  |
start:                          end:
<token_1>                  <token_2>
|-->[★]->[★]->[★]->[★]-->|
|                        |
|<--  filter: only ★  -->|
|                        |
start:                end:
<token_3>        <token_4>
```
Realistically, there is currently no way of knowing without making more API
calls. However, just making more API calls isn't a solution in Elm because of
its architecture.
> **SOLUTION:** As described in the **View** function, we may assume that
overlapping timeline batches have overlapping events. If they overlap yet have
no overlapping events, then their filters must be disjoint. If the filters are
disjoint, we do not care whether they're overlapping.
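To make the disjoint case concrete, here is a sketch against the `Internal.Filter.Timeline` module added in this changeset; the event type names are made up:
```elm
import Internal.Filter.Timeline as Filter exposing (Filter)

-- Filter of the first batch: only ■ and ● (two made-up event types).
squareAndCircle : Filter
squareAndCircle =
    Filter.onlyTypes [ "example.square", "example.circle" ]

-- Filter of the second batch: only ★.
starOnly : Filter
starOnly =
    Filter.onlyTypes [ "example.star" ]

-- The combined filter allows no events at all, so the two filters are
-- disjoint: the batches never need to share events, and we do not care
-- how they are positioned relative to each other.
disjoint : Bool
disjoint =
    Filter.and squareAndCircle starOnly == Filter.fail
```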
### Challenge 2: same filters, same spot
Suppose there is a known timeline batch, and we're trying to **Update** the
timeline to represent the timeline between `<token_1>` and `<token_2>` for a
different filter:
```
|-->[■]->[■]->[●]->[■]->[■]->[●]-->|
|                                  |
|<---   filter: only ■ and ●   --->|
|                                  |
start:                          end:
<token_1>                  <token_2>
```
If we wish to know what's in there for a different filter `f`, then:
1. If `f` equals the filter from the timeline batch, we can copy the events.
2. If `f` is a subfilter of the batch filter (for example: `only ■`) then we can
copy the events from the given batch, and then locally filter out the events
that do not match filter `f`.
3. If the batch filter is a subfilter of `f`, then we can use an API call
between the same batch tokens `<token_1>` and `<token_2>`. In the worst
case, we receive the exact same list of events. In another scenario, we
might discover far more events and receive some new batch value `<token_3>`
in-between `<token_1>` and `<token_2>`.
4. If neither filter is a subfilter of the other and the two are (at least
partially) disjoint, then they do not need to correlate and any other batch
values can be chosen, as sketched below.
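A sketch of that decision, written against the `subsetOf` function this changeset adds to `Internal.Filter.Timeline`; the `FillStrategy` type and the function name are made up for illustration:
```elm
import Internal.Filter.Timeline as Filter exposing (Filter)

-- Made-up summary of the four cases above; not part of the SDK.
type FillStrategy
    = CopyEvents -- case 1: the filters are equal
    | CopyAndFilterLocally -- case 2: f is a subfilter of the batch filter
    | AskApiBetweenTokens -- case 3: the batch filter is a subfilter of f
    | Unrelated -- case 4: neither is a subfilter of the other

fillStrategy : Filter -> Filter -> FillStrategy
fillStrategy batchFilter f =
    if batchFilter == f then
        CopyEvents

    else if Filter.subsetOf batchFilter f then
        -- f only allows events that the batch filter also allows
        CopyAndFilterLocally

    else if Filter.subsetOf f batchFilter then
        AskApiBetweenTokens

    else
        Unrelated
```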

View File

@ -3,11 +3,37 @@
"name": "noordstar/elm-matrix-sdk-beta",
"summary": "Matrix SDK for instant communication. Unstable beta version for testing only.",
"license": "EUPL-1.1",
"version": "2.1.0",
"version": "2.1.2",
"exposed-modules": [
"Internal.Config.Default",
"Internal.Config.Leaks",
"Internal.Config.Log",
"Internal.Config.Phantom",
"Internal.Config.Text",
"Internal.Filter.Timeline",
"Internal.Grammar.ServerName",
"Internal.Grammar.UserId",
"Internal.Tools.DecodeExtra",
"Internal.Tools.EncodeExtra",
"Internal.Tools.Hashdict",
"Internal.Tools.Iddict",
"Internal.Tools.Json",
"Internal.Tools.Mashdict",
"Internal.Tools.Timestamp",
"Internal.Tools.VersionControl",
"Internal.Values.Context",
"Internal.Values.Envelope",
"Internal.Values.Event",
"Internal.Values.Settings",
"Internal.Values.StateManager",
"Internal.Values.Timeline",
"Internal.Values.User",
"Internal.Values.Vault",
"Matrix",
"Matrix.Event",
"Matrix.Settings"
"Matrix.Settings",
"Matrix.User",
"Types"
],
"elm-version": "0.19.0 <= v < 0.20.0",
"dependencies": {
@ -15,6 +41,7 @@
"elm/json": "1.0.0 <= v < 2.0.0",
"elm/parser": "1.0.0 <= v < 2.0.0",
"elm/time": "1.0.0 <= v < 2.0.0",
"micahhahn/elm-safe-recursion": "2.0.0 <= v < 3.0.0",
"miniBill/elm-fast-dict": "1.0.0 <= v < 2.0.0"
},
"test-dependencies": {

View File

@ -23,7 +23,7 @@ will assume until overridden by the user.
-}
currentVersion : String
currentVersion =
"beta 2.1.0"
"beta 2.1.2"
{-| The default device name that is being communicated with the Matrix API.

View File

@ -1,4 +1,7 @@
module Internal.Config.Leaks exposing (accessToken, baseUrl, transaction, versions)
module Internal.Config.Leaks exposing
( accessToken, baseUrl, transaction, versions
, allLeaks
)
{-|
@ -29,8 +32,14 @@ know 100% sure that the value isn't `Nothing`.
@docs accessToken, baseUrl, transaction, versions
For safety purposes, all leaking values are stored in the following value:
@docs allLeaks
-}
import Set exposing (Set)
{-| Placeholder access token.
-}
@ -39,6 +48,20 @@ accessToken =
"elm-sdk-placeholder-access-token-leaks"
{-| Complete set of all leaking values. Commonly used for testing purposes.
-}
allLeaks : Set String
allLeaks =
Set.union
(Set.fromList versions)
(Set.fromList
[ accessToken
, baseUrl
, transaction
]
)
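For illustration, a sketch of how the leak set can be used to spot a placeholder value:
```elm
import Internal.Config.Leaks as Leaks
import Set

-- True: the placeholder access token is part of the leak set.
suspicious : Bool
suspicious =
    Set.member Leaks.accessToken Leaks.allLeaks
```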
{-| Placeholder base URL.
-}
baseUrl : String

105
src/Internal/Config/Log.elm Normal file
View File

@ -0,0 +1,105 @@
module Internal.Config.Log exposing (Log, log)
{-|
# Logs
The logs module exposes various log types that can be used to indicate logs.
This helps users filter for the logs that they care about.
@docs Log, log
The logs are encoded as strings so as to allow the addition of new log types
without triggering a major update.
-}
-- @docs caughtError, debug, error, info, securityWarn, warn
{-| Common pattern for a log message. The log message consists of a log channel
like `debug`, `warn`, `error`, etc. and the content of the message.
These logs are completely optional: they can be ignored, they can be sent to the
console, or a dialog may be created that presents the log messages.
-}
type alias Log =
{ channel : String, content : String }
{-| Create a log message of various log types.
-}
log :
{ caughtError : String -> Log
, debug : String -> Log
, error : String -> Log
, info : String -> Log
, securityWarn : String -> Log
, warn : String -> Log
}
log =
{ caughtError = Log caughtError
, debug = Log debug
, error = Log error
, info = Log info
, securityWarn = Log securityWarn
, warn = Log warn
}
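For context, a minimal sketch of how this record is meant to be used; the message text is made up:
```elm
import Internal.Config.Log as Log exposing (Log)

-- Produces { channel = "debug", content = "Received a response from /sync" }
syncLog : Log
syncLog =
    Log.log.debug "Received a response from /sync"
```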
{-| A caught error is an error that has been caught elsewhere in the code, hence
functioning as a secondary debug channel.
-}
caughtError : String
caughtError =
"caught-error"
{-| Debug logs are logs that can be used to debug API interactions.
-}
debug : String
debug =
"debug"
{-| Error strings indicate that something unexpected has happened. As a result,
something has stopped working.
-}
error : String
error =
"error"
{-| Info contains relevant info for the user
-}
info : String
info =
"info"
{-| Security warnings are warnings that contain red flags.
Of course, the Elm SDK is not aware of any security vulnerabilities that it
contains, but it can raise a user's attention to suspicious situations.
For example, if the homeserver returns room ids that do not look like usernames
at all, the Elm SDK can raise a security warning, which indicates that:
1. The homeserver might be bugged
2. The Elm SDK might be severely outdated
3. The homeserver might be compromised and/or trying to attack the Elm SDK
-}
securityWarn : String
securityWarn =
"security-warn"
{-| Warning logs are logs that are unusual, but that can be dealt with. Warnings
are debug logs that are out of the ordinary.
-}
warn : String
warn =
"warn"

View File

@ -0,0 +1,51 @@
module Internal.Config.Phantom exposing (PString(..), PInt(..), PBool(..), PList(..))
{-|
# Phantom types
This module contains a lot of phantom types that do not necessarily do anything,
but they force the compiler to create an error whenever something illegal is
done.
Compiler errors may seem annoying, but they can help you write good code. In a
functional programming language like Elm, the trick is to design your code in
such a way that if it compiles, it works. Phantom types can help you do so.
The phantom types in this module help you in the following way:
1. They can force a compile error when you forget to run a function.
2. They can help track values for security.
## Standard data types
@docs PString, PInt, PBool, PList
-}
{-| Opaque type that encapsulates a bool.
-}
type PBool ph
= PBool Bool
{-| Opaque type that encapsulates an int.
-}
type PInt ph
= PInt Int
{-| Opaque type that encapsulates a list.
-}
type PList ph a
= PList (List a)
{-| Opaque type that encapsulates a string.
-}
type PString ph
= PString String
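As an illustration of the first point in the module documentation, a phantom tag can turn a forgotten function call into a compile error. Only `PInt` comes from the module above; the `Unvalidated` and `Validated` tags and the two functions are made up:
```elm
import Internal.Config.Phantom exposing (PInt(..))

-- Made-up phantom tags; they carry no runtime data.
type Unvalidated
    = Unvalidated

type Validated
    = Validated

-- The only way to obtain a validated port number.
validatePort : PInt Unvalidated -> Maybe (PInt Validated)
validatePort (PInt p) =
    if 0 <= p && p <= 65535 then
        Just (PInt p)

    else
        Nothing

-- Demands a validated port: forgetting to run validatePort is now a type error.
connect : PInt Validated -> String
connect (PInt p) =
    "Connecting on port " ++ String.fromInt p
```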

View File

@ -1,7 +1,9 @@
module Internal.Config.Text exposing
( versionsFoundLocally, versionsReceived, versionsFailedToDecode
( docs, failures, fields, mappings, logs, parses
, accessTokenFoundLocally, accessTokenExpired, accessTokenInvalid
, versionsFoundLocally, versionsReceived, versionsFailedToDecode
, unsupportedVersionForEndpoint
, decodedDictSize, invalidHashInHashdict, invalidHashInMashdict, leakingValueFound
)
{-| Throughout the Elm SDK, there are lots of pieces of text being used for
@ -23,12 +25,9 @@ This is a risky feature, keep in mind that even a patch update might break this!
You should only do this if you know what you're doing.
## API Versions
## Type documentation
Messages sent as API logs while the Elm SDK is figuring out how modern the
homeserver is and how it can best communicate.
@docs versionsFoundLocally, versionsReceived, versionsFailedToDecode
@docs docs, failures, fields, mappings, logs, parses
## API Authentication
@ -41,15 +40,38 @@ interaction.
offers room for translation, re-wording and refactors.
## API Versions
Messages sent as API logs while the Elm SDK is figuring out how modern the
homeserver is and how it can best communicate.
@docs versionsFoundLocally, versionsReceived, versionsFailedToDecode
## API miscellaneous messages
Messages sent as API logs during communication with the API.
@docs unsupportedVersionForEndpoint
## JSON decoder
Messages sent as API logs when a JSON value is being decoded.
@docs decodedDictSize, invalidHashInHashdict, invalidHashInMashdict, leakingValueFound
-}
type alias Desc =
List String
type alias TypeDocs =
{ name : String, description : Desc }
{-| Logs when the Matrix API returns that an access token is no longer valid.
-}
accessTokenExpired : String
@ -73,6 +95,448 @@ accessTokenInvalid =
"Matrix API rejected access token as invalid"
{-| Logs when the JSON decoder detects that an imported dictionary contained
duplicate keys.
-}
decodedDictSize : Int -> Int -> String
decodedDictSize from to =
String.concat
[ "JSON dict contained duplicate keys (JSON had "
, String.fromInt from
, " keys, Elm dict has "
, String.fromInt to
, " keys)"
]
{-| Documentation used for all functions and data types in JSON coders
-}
docs :
{ context : TypeDocs
, envelope : TypeDocs
, event : TypeDocs
, hashdict : TypeDocs
, ibatch : TypeDocs
, iddict : TypeDocs
, itoken : TypeDocs
, mashdict : TypeDocs
, settings : TypeDocs
, stateManager : TypeDocs
, timeline : TypeDocs
, timelineFilter : TypeDocs
, unsigned : TypeDocs
}
docs =
{ context =
{ name = "Context"
, description =
[ "The Context is the set of variables that the user (mostly) cannot control."
, "The Context contains tokens, values and other bits that the Vault receives from the Matrix API."
]
}
, envelope =
{ name = "Envelope"
, description =
[ "The Envelope module wraps existing data types with lots of values and settings that can be adjusted manually."
]
}
, event =
{ name = "Event"
, description =
[ "The Event type represents a single value that contains all the information for a single event in the room."
]
}
, hashdict =
{ name = "Hashdict"
, description =
[ "This allows you to store values based on an externally defined identifier."
, "For example, the hashdict can store events and use their event id as their key."
]
}
, ibatch =
{ name = "IBatch"
, description =
[ "The internal batch tracks a patch of events on the Matrix timeline."
]
}
, iddict =
{ name = "Iddict"
, description =
[ "An iddict automatically handles creating appropriate keys by incrementally assigning a new key to new values."
]
}
, itoken =
{ name = "IToken"
, description =
[ "The IToken connects batches in the timeline and maintains relative order."
]
}
, mashdict =
{ name = "Mashdict"
, description =
[ "The mashdict exclusively stores values for which the hashing algorithm returns a value, and it ignores the outcome for all other scenarios."
]
}
, settings =
{ name = "Settings"
, description =
[ "The settings type is a data type to configure settings in the enveloped data type."
]
}
, stateManager =
{ name = "StateManager"
, description =
[ "The StateManager tracks the room state based on events, their event types and the optional state keys they provide."
, "Instead of making the user loop through the room's timeline of events, the StateManager offers the user a dictionary-like experience to navigate through the Matrix room state."
]
}
, timeline =
{ name = "Timeline"
, description =
[ "The Timeline tracks events and orders them in a simple way for the user to view them."
]
}
, timelineFilter =
{ name = "Timeline Filter"
, description =
[ "The Timeline Filter allows the user to be very specific about which events they're interested in."
]
}
, unsigned =
{ name = "Unsigned Data"
, description =
[ "Unsigned data is optional data that might come along with the event."
, "This information is often supportive but not necessary to the context."
]
}
}
{-| Description of all edge cases where a JSON decoder can fail.
-}
failures : { hashdict : Desc, listWithOne : String, mashdict : Desc }
failures =
{ hashdict =
[ "Not all values map to their respective hash with the given hash function."
]
, listWithOne = "Expected at least one value in the list - zero found."
, mashdict =
[ "Not all values map to their respective hash with the given hash function."
]
}
{-| Objects contain multiple fields. These fields are here described, explaining
what they do and what they are for.
-}
fields :
{ context :
{ accessToken : Desc
, baseUrl : Desc
, password : Desc
, refreshToken : Desc
, username : Desc
, transaction : Desc
, versions : Desc
}
, envelope :
{ content : Desc
, context : Desc
, settings : Desc
}
, event :
{ content : Desc
, eventId : Desc
, originServerTs : Desc
, roomId : Desc
, sender : Desc
, stateKey : Desc
, eventType : Desc
, unsigned : Desc
}
, ibatch :
{ end : Desc
, events : Desc
, filter : Desc
, start : Desc
}
, iddict :
{ cursor : Desc
, dict : Desc
}
, itoken :
{ behind : Desc
, ends : Desc
, inFrontOf : Desc
, name : Desc
, starts : Desc
}
, settings :
{ currentVersion : Desc
, deviceName : Desc
, syncTime : Desc
}
, timeline :
{ batches : Desc
, events : Desc
, filledBatches : Desc
, mostRecentBatch : Desc
, tokens : Desc
}
, timelineFilter :
{ senders : Desc
, sendersAllowOthers : Desc
, types : Desc
, typesAllowOthers : Desc
}
, unsigned :
{ age : Desc
, prevContent : Desc
, redactedBecause : Desc
, transactionId : Desc
}
}
fields =
{ context =
{ accessToken =
[ "The access token used for authentication with the Matrix server."
]
, baseUrl =
[ "The base URL of the Matrix server."
]
, password =
[ "The user's password for authentication purposes."
]
, refreshToken =
[ "The token used to obtain a new access token upon expiration of the current access token."
]
, username =
[ "The username of the Matrix account."
]
, transaction =
[ "A unique identifier for a transaction initiated by the user."
]
, versions =
[ "The versions of the Matrix protocol that are supported by the server."
]
}
, envelope =
{ content =
[ "The actual data or payload that is wrapped within the envelope."
]
, context =
[ "The context information associated with the envelope, such as environment or session details."
, "In general, this data cannot be directly configured by the user."
]
, settings =
[ "The configurable settings that affect how the enveloped data is handled or processed."
]
}
, event =
{ content =
[ "The body of this event, as created by the client which sent it."
]
, eventId =
[ "The globally unique identifier for this event."
]
, originServerTs =
[ "Timestamp (in milliseconds since the unix epoch) on originating homeserver when this event was sent."
]
, roomId =
[ "The ID of the room associated with this event."
]
, sender =
[ "Contains the fully-qualified ID of the user who sent this event."
]
, stateKey =
[ "Present if, and only if, this event is a state event. The key making this piece of state unique in the room. Note that it is often an empty string."
, "State keys starting with an @ are reserved for referencing user IDs, such as room members. With the exception of a few events, state events set with a given user's ID as the state key MUST only be set by that user."
]
, eventType =
[ "The type of the event."
]
, unsigned =
[ "Contains optional extra information about the event."
]
}
, ibatch =
{ end =
[ "Pointer to the token that ends the internal batch."
]
, events =
[ "List of event IDs contained within the internal batch."
]
, filter =
[ "Filter that indicates how strictly the homeserver selected events when producing the given list of events."
]
, start =
[ "Pointer to the token that starts the internal batch."
]
}
, iddict =
{ cursor =
[ "To ensure uniqueness of all keys and to prevent the usage of keys that were previously assigned to older values, the iddict tracks which is the smallest non-negative integer that hasn't been used yet."
]
, dict =
[ "Dictionary that contains all values stored in the iddict."
]
}
, itoken =
{ behind =
[ "This token is behind all tokens in this field."
]
, ends =
[ "This token is in front of the batches in this field."
]
, inFrontOf =
[ "This token is ahead of all tokens in this field."
]
, name =
[ "Opaque value provided by the homeserver."
]
, starts =
[ "This token is at the start of the batches in this field."
]
}
, settings =
{ currentVersion =
[ "Indicates the current version of the Elm SDK."
]
, deviceName =
[ "Indicates the device name that is communicated to the Matrix API."
]
, syncTime =
[ "Indicates the frequency in milliseconds with which the Elm SDK should long-poll the /sync endpoint."
]
}
, timeline =
{ batches =
[ "Dictionary storing all event batches in the timeline."
]
, events =
[ "Mapping that allows us to quickly zoom in on an event."
]
, filledBatches =
[ "Counter that tracks how many batches are kept by the timeline."
, "Batches are only counted if they are filled by at least one event."
]
, mostRecentBatch =
[ "Tracks the most recent batch that was sent by the homeserver - usually through `/sync`"
]
, tokens =
[ "Index of all the tokens used to connect event batches on the timeline."
]
}
, timelineFilter =
{ senders =
[ "A list of senders that is considered an exception to the infinite pool of \"other\" users"
]
, sendersAllowOthers =
[ "Value that determines whether the infinite pool of others is included."
, "If False, only the users mentioned in `senders` are included. If True, then all users who aren't mentioned in `senders` are included."
]
, types =
[ "A list of event types that is considered an exception to the infinite pool of \"other\" event types."
]
, typesAllowOthers =
[ "Value that determines whether the infinite pool of others is included."
, "If False, only the event types mentioned in `types` are included. If True, then all event types that aren't mentioned in `types` are included."
]
}
, unsigned =
{ age =
[ "The time in milliseconds that has elapsed since the event was sent. This field is generated by the local homeserver, and may be incorrect if the local time on at least one of the two servers is out of sync, which can cause the age to either be negative or greater than it actually is."
]
, prevContent =
[ "The previous content for this event. This field is generated by the local homeserver, and is only returned if the event is a state event, and the client has permission to see the previous content."
]
, redactedBecause =
[ "The event that redacted this event, if any."
]
, transactionId =
[ "The client-supplied transaction ID, for example, provided via PUT /_matrix/client/v3/rooms/{roomId}/send/{eventType}/{txnId}, if the client being given the event is the same one which sent it."
]
}
}
{-| This message will be shown when a [Hashdict](Internal-Tools-Hashdict)
encounters a hash-value pair where the value does not hash to the provided hash.
-}
invalidHashInHashdict : String
invalidHashInHashdict =
"Invalid hash function: not all elements hash to their JSON-stored hashes"
{-| This message will be shown when a [Mashdict](Internal-Tools-Mashdict)
encounters a hash-value pair where the value does not hash to the provided hash.
-}
invalidHashInMashdict : String
invalidHashInMashdict =
"Invalid hash function: not all elements hash to their JSON-stored hashes"
{-| The Elm SDK occasionally uses [leaking values](Internal-Config-Leaks),
which might indicate exceptional behaviour. As such, this log is sent when one
of those leaking values is found: to alert the user that something fishy might
be going on.
-}
leakingValueFound : String -> String
leakingValueFound leaking_value =
"Found leaking value : " ++ leaking_value
{-| These logs might appear during a process where something unexpected has
happened. Most of these unexpected results are accounted for by the Elm SDK,
but they are logged so that the programmer can do something about it.
-}
logs : { keyIsNotAnInt : String -> String }
logs =
{ keyIsNotAnInt =
\key ->
String.concat
[ "Encountered a key `"
, key
, "` that cannot be converted to an Int"
]
}
{-| Function descriptions
-}
mappings : { itokenPTR : TypeDocs }
mappings =
{ itokenPTR =
{ name = "ITokenPTR init"
, description =
[ "Converts an optional string to an Itoken pointer."
]
}
}
{-| Logs for issues that might be found while parsing strings into meaningful data.
-}
parses :
{ historicalUserId : String -> String
, reservedIPs :
{ ipv6Toipv4 : String
, multicast : String
, futureUse : String
, unspecified : String
}
}
parses =
{ historicalUserId = \name -> "Found a historical username `" ++ name ++ "`."
, reservedIPs =
{ ipv6Toipv4 = "Detected a reserved ip address that was formerly used as an IPv6 to IPv4 relay. It is unlikely that this IP Address is real."
, multicast = "Detected a reserved ip address that is used for multicasting. It is unlikely that this IP Address is real."
, futureUse = "Detected an ip address that is reserved for future use. It is unlikely that this IP Address is real if you're running a recent version of the Elm SDK."
, unspecified = "This is an unspecified ip address. It is unlikely that this IP Address is real and someone might try to break something."
}
}
{-| The Matrix homeserver can specify how it wishes to communicate, and the Elm
SDK aims to communicate accordingly. This may fail in some scenarios, however,
in which case it will throw this error.

View File

@ -0,0 +1,350 @@
module Internal.Filter.Timeline exposing
( Filter
, pass, onlySenders, allSendersExcept, onlyTypes, allTypesExcept, fail
, match, run
, and
, subsetOf
, coder, encode, decoder
)
{-|
# Timeline filter
The timeline filter creates filters for looking through a timeline, as well as
for interacting with the Matrix API.
## Timeline
@docs Filter
## Create
@docs pass, onlySenders, allSendersExcept, onlyTypes, allTypesExcept, fail
## Filter
@docs match, run
## Combine
@docs and
## Compare
@docs subsetOf
## JSON coders
@docs coder, encode, decoder
-}
import Internal.Config.Text as Text
import Internal.Grammar.UserId as U
import Internal.Tools.Json as Json
import Json.Decode as D
import Json.Encode as E
import Set exposing (Set)
{-| Placeholder Event type so the real Event doesn't need to be imported.
-}
type alias Event a =
{ a | eventType : String, sender : U.UserID }
{-| The Timeline Filter filters events out of a timeline, guaranteeing that only
the events that meet the given criteria pass through.
-}
type Filter
= Filter
{ senders : Set String
, sendersAllowOthers : Bool
, types : Set String
, typesAllowOthers : Bool
}
{-| Allow events from all senders, except if they are on the provided list.
If the list is empty, all events are allowed.
-}
allSendersExcept : List String -> Filter
allSendersExcept senders =
case senders of
[] ->
pass
_ :: _ ->
Filter
{ senders = Set.fromList senders
, sendersAllowOthers = True
, types = Set.empty
, typesAllowOthers = True
}
{-| Allow events of every event type, except if they are on the provided list.
If the list is empty, all events are allowed.
-}
allTypesExcept : List String -> Filter
allTypesExcept types =
case types of
[] ->
pass
_ :: _ ->
Filter
{ senders = Set.empty
, sendersAllowOthers = True
, types = Set.fromList types
, typesAllowOthers = True
}
{-| Only allow an event if it meets the criteria of two Filters.
-}
and : Filter -> Filter -> Filter
and (Filter f1) (Filter f2) =
let
stdAnd : Filter
stdAnd =
Filter
{ senders =
case ( f1.sendersAllowOthers, f2.sendersAllowOthers ) of
( True, True ) ->
Set.union f1.senders f2.senders
( True, False ) ->
Set.diff f2.senders f1.senders
( False, True ) ->
Set.diff f1.senders f2.senders
( False, False ) ->
Set.intersect f1.senders f2.senders
, sendersAllowOthers = f1.sendersAllowOthers && f2.sendersAllowOthers
, types =
case ( f1.typesAllowOthers, f2.typesAllowOthers ) of
( True, True ) ->
Set.union f1.types f2.types
( True, False ) ->
Set.diff f2.types f1.types
( False, True ) ->
Set.diff f1.types f2.types
( False, False ) ->
Set.intersect f1.types f2.types
, typesAllowOthers = f1.typesAllowOthers && f2.typesAllowOthers
}
in
case stdAnd of
Filter f ->
if Set.isEmpty f.senders && not f.sendersAllowOthers then
fail
else if Set.isEmpty f.types && not f.typesAllowOthers then
fail
else
stdAnd
{-| Define how to encode and decode a Timeline Filter to and from a JSON value.
-}
coder : Json.Coder Filter
coder =
Json.object4
{ name = Text.docs.timelineFilter.name
, description = Text.docs.timelineFilter.description
, init =
\a b c d ->
Filter
{ senders = a
, sendersAllowOthers = b
, types = c
, typesAllowOthers = d
}
}
(Json.field.optional.withDefault
{ fieldName = "senders"
, toField = \(Filter f) -> f.senders
, description = Text.fields.timelineFilter.senders
, coder = Json.set Json.string
, default = ( Set.empty, [] )
, defaultToString = always "[]"
}
)
(Json.field.required
{ fieldName = "sendersAllowOthers"
, toField = \(Filter f) -> f.sendersAllowOthers
, description = Text.fields.timelineFilter.sendersAllowOthers
, coder = Json.bool
}
)
(Json.field.optional.withDefault
{ fieldName = "types"
, toField = \(Filter f) -> f.types
, description = Text.fields.timelineFilter.types
, coder = Json.set Json.string
, default = ( Set.empty, [] )
, defaultToString = always "[]"
}
)
(Json.field.required
{ fieldName = "typesAllowOthers"
, toField = \(Filter f) -> f.typesAllowOthers
, description = Text.fields.timelineFilter.typesAllowOthers
, coder = Json.bool
}
)
{-| Decode a Filter from a JSON value.
-}
decoder : Json.Decoder Filter
decoder =
Json.decode coder
{-| Encode a Filter into a JSON value.
-}
encode : Json.Encoder Filter
encode =
Json.encode coder
{-| Allow no events. This filter is likely quite useless in practice, but it is
used in the test environment for sanity checks and comparisons.
-}
fail : Filter
fail =
Filter
{ senders = Set.empty
, sendersAllowOthers = False
, types = Set.empty
, typesAllowOthers = False
}
{-| Determine whether an event passes a filter.
-}
match : Filter -> Event a -> Bool
match (Filter f) { eventType, sender } =
let
mentionedSender : Bool
mentionedSender =
Set.member (U.toString sender) f.senders
mentionedType : Bool
mentionedType =
Set.member eventType f.types
in
xor mentionedSender f.sendersAllowOthers
&& xor mentionedType f.typesAllowOthers
{-| Only allow events sent by the given senders.
If an empty list is given, no events are allowed.
-}
onlySenders : List String -> Filter
onlySenders senders =
case senders of
[] ->
fail
_ :: _ ->
Filter
{ senders = Set.fromList senders
, sendersAllowOthers = False
, types = Set.empty
, typesAllowOthers = True
}
{-| Only allow events of a given event type.
If an empty list is given, no events are allowed.
-}
onlyTypes : List String -> Filter
onlyTypes types =
case types of
[] ->
fail
_ :: _ ->
Filter
{ senders = Set.empty
, sendersAllowOthers = True
, types = Set.fromList types
, typesAllowOthers = False
}
{-| Create a filter that allows all events. This can be useful when trying to
combine multiple filters, or when simply all events are allowed.
-}
pass : Filter
pass =
Filter
{ senders = Set.empty
, sendersAllowOthers = True
, types = Set.empty
, typesAllowOthers = True
}
{-| Use a filter on a list of events.
-}
run : Filter -> List (Event a) -> List (Event a)
run f events =
List.filter (match f) events
{-| Determine whether the second argument is a subset filter of the first
argument.
-}
subsetOf : Filter -> Filter -> Bool
subsetOf (Filter big) (Filter small) =
let
isSSof : Set String -> Set String -> Bool
isSSof b s =
Set.intersect b s == s
isSubsetFor : ( Bool, Set String ) -> ( Bool, Set String ) -> Bool
isSubsetFor ( bb, sb ) ( bs, ss ) =
case ( bb, bs ) of
( True, True ) ->
isSSof ss sb
( True, False ) ->
Set.isEmpty (Set.intersect sb ss)
( False, True ) ->
False
( False, False ) ->
isSSof sb ss
in
isSubsetFor
( big.sendersAllowOthers, big.senders )
( small.sendersAllowOthers, small.senders )
&& isSubsetFor
( big.typesAllowOthers, big.types )
( small.typesAllowOthers, small.types )
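A quick illustration of `match`; the user ID, event type and record are made up, and the record carries only the two fields the filter inspects:
```elm
import Internal.Filter.Timeline as Filter
import Internal.Grammar.ServerName exposing (HostName(..))
import Internal.Grammar.UserId exposing (UserID)

alice : UserID
alice =
    { localpart = "alice"
    , domain = { host = DNS "example.org", port_ = Nothing }
    }

-- True: the event type is on the allow list and the sender is not restricted.
allowed : Bool
allowed =
    Filter.match
        (Filter.onlyTypes [ "m.room.message" ])
        { eventType = "m.room.message", sender = alice }
```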

View File

@ -0,0 +1,279 @@
module Internal.Grammar.ServerName exposing
( ServerName, toString, fromString
, serverNameParser
, HostName(..)
)
{-|
# Server name
A homeserver is uniquely identified by its server name. The server name
represents the address at which the homeserver in question can be reached by
other homeservers.
@docs ServerName, toString, fromString
## Parser
@docs serverNameParser
## Debug
@docs HostName
-}
import Internal.Tools.ParserExtra as PE
import Parser as P exposing ((|.), (|=), Parser)
{-| The hostname is the location where the server can be found.
Notice how the Matrix spec specifies that the hostname can either be a DNS name,
an IPv4Address or an IPv6Address. However, since the IPv4Address is compatible
with the specification of DNS names and RFC1123 (section 2.1) does not require
a client to distinguish them, we treat IPv4Addresses like DNS names.
-}
type HostName
= DNS String
| IPv6 IPv6Address
{-| The IPv6Address is represented by a list of items BEFORE and AFTER the
double colons (::).
-}
type alias IPv6Address =
{ front : List String, back : List String }
{-| The server name is a combination of a hostname and an optional port.
-}
type alias ServerName =
{ host : HostName, port_ : Maybe Int }
{-| Parser for the DNS name record. The Matrix spec bases its grammar on the
standard for internet host names, as specified by RFC1123, section 2.1, with an
extension for IPv6 literals.
[RFC-1123 §2.2]
The syntax of a legal Internet host name was specified in RFC-952
[DNS:4]. One aspect of host name syntax is hereby changed: the
restriction on the first character is relaxed to allow either a
letter or a digit. Host software MUST support this more liberal
syntax.
Host software MUST handle host names of up to 63 characters and
SHOULD handle host names of up to 255 characters.
[RFC-952 §Assumptions-1]
A "name" (Net, Host, Gateway, or Domain name) is a text string up
to 24 characters drawn from the alphabet (A-Z), digits (0-9), minus
sign (-), and period (.). Note that periods are only allowed when
they serve to delimit components of "domain style names". (See
RFC-921, "Domain Name System Implementation Schedule", for
background).
-}
dnsNameParser : Parser String
dnsNameParser =
P.chompIf Char.isAlphaNum
|. P.chompWhile (\c -> Char.isAlphaNum c || c == '-' || c == '.')
|> P.getChompedString
{-| Convert a string to a server name.
-}
fromString : String -> Maybe ServerName
fromString s =
P.run (serverNameParser |. P.end) s
|> (\out ->
case out of
Ok _ ->
out
Err e ->
Debug.log "No parse" e
|> always (Debug.log "original" s)
|> always out
)
|> Result.toMaybe
{-| Parse a Hostname.
-}
hostnameParser : Parser HostName
hostnameParser =
P.oneOf
[ P.succeed IPv6
|. P.symbol "["
|= ipv6Parser
|. P.symbol "]"
, P.succeed DNS
|= dnsNameParser
]
{-| Parse all values to the left of the double colon (::)
-}
ipv6LeftParser : Parser (List String)
ipv6LeftParser =
P.oneOf
[ P.succeed []
|. P.symbol ":"
, P.succeed (|>)
|= PE.times 1 7 (ipv6NumParser |. P.symbol ":")
|= P.oneOf
[ P.succeed (\bottom tail -> tail ++ [ bottom ])
|= ipv6NumParser
, P.succeed identity
]
]
{-| Parse an ordinary IPv6 number
-}
ipv6NumParser : Parser String
ipv6NumParser =
P.chompIf Char.isHexDigit
|> P.getChompedString
|> PE.times 1 4
|> P.map String.concat
{-| Parse an IPv6 Address
-}
ipv6Parser : Parser IPv6Address
ipv6Parser =
ipv6LeftParser
|> P.andThen
(\front ->
if List.length front < 8 then
P.succeed (IPv6Address front)
|= ipv6RightParser (8 - 1 - List.length front)
-- The -1 is because :: implies one or more zeroes
else
P.succeed (IPv6Address front [])
)
{-| Parse all values to the right of the double colon (::)
-}
ipv6RightParser : Int -> Parser (List String)
ipv6RightParser n =
if n > 0 then
P.succeed identity
|. P.symbol ":"
|= P.oneOf
[ P.succeed (::)
|= ipv6NumParser
|= PE.times 0
(n - 1)
(P.succeed identity
|. P.symbol ":"
|= ipv6NumParser
)
, P.succeed []
]
else
P.succeed []
|. P.symbol ":"
{-| Convert an IPv6 address to a readable string format
-}
ipv6ToString : IPv6Address -> String
ipv6ToString { front, back } =
(if List.length front == 8 then
front
else if List.length back == 8 then
back
else
List.concat [ front, [ "" ], back ]
)
|> List.intersperse ":"
|> String.concat
portParser : Parser Int
portParser =
P.chompIf Char.isDigit
|. P.chompWhile Char.isDigit
|> P.getChompedString
|> P.andThen
(\v ->
case String.toInt v of
Just i ->
if 0 <= i && i <= 2 ^ 16 - 1 then
P.succeed i
else
P.problem ("Port out of range: " ++ v)
Nothing ->
P.problem "Not a port number"
)
{-| Parse a server name. Generally used by other identifiers that have a server
name as one of their parts.
-}
serverNameParser : Parser ServerName
serverNameParser =
P.succeed ServerName
|= hostnameParser
|= P.oneOf
[ P.succeed Just
|. P.symbol ":"
|= portParser
, P.succeed Nothing
]
{-| Convert a parsed server name back to a string.
-}
toString : ServerName -> String
toString { host, port_ } =
let
hostString : String
hostString =
case host of
DNS name ->
name
IPv6 { front, back } ->
(if List.length front == 8 then
List.intersperse ":" front
else if List.length back == 8 then
List.intersperse ":" back
else
List.concat
[ List.intersperse ":" front
, [ "::" ]
, List.intersperse ":" back
]
)
|> String.concat
|> (\i -> "[" ++ i ++ "]")
portString : String
portString =
port_
|> Maybe.map String.fromInt
|> Maybe.map ((++) ":")
|> Maybe.withDefault ""
in
hostString ++ portString
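For illustration, a parse/print round trip with a made-up server name:
```elm
import Internal.Grammar.ServerName as ServerName exposing (HostName(..), ServerName)

-- Just { host = DNS "matrix.example.org", port_ = Just 8448 }
parsed : Maybe ServerName
parsed =
    ServerName.fromString "matrix.example.org:8448"

-- Just "matrix.example.org:8448"
roundTrip : Maybe String
roundTrip =
    Maybe.map ServerName.toString parsed
```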

View File

@ -0,0 +1,128 @@
module Internal.Grammar.UserId exposing
( UserID, toString, fromString
, userIdParser, isHistorical
)
{-|
# User ids
Users within Matrix are uniquely identified by their Matrix user ID. The user
ID is namespaced to the homeserver which allocated the account and has the form:
@localpart:domain
The localpart of a user ID is an opaque identifier for that user. It MUST NOT
be empty, and MUST contain only the characters a-z, 0-9, ., \_, =, -, /, and +.
The domain of a user ID is the server name of the homeserver which allocated
the account.
The length of a user ID, including the @ sigil and the domain, MUST NOT exceed
255 characters.
The complete grammar for a legal user ID is:
user_id = "@" user_id_localpart ":" server_name
user_id_localpart = 1*user_id_char
user_id_char = DIGIT
/ %x61-7A ; a-z
/ "-" / "." / "=" / "_" / "/" / "+"
Older versions of this specification were more tolerant of the characters
permitted in user ID localparts. There are currently active users whose user
IDs do not conform to the permitted character set, and a number of rooms whose
history includes events with a sender which does not conform. In order to
handle these rooms successfully, clients and servers MUST accept user IDs with
localparts from the expanded character set:
extended_user_id_char = %x21-39 / %x3B-7E ; all ASCII printing chars except :
## User ID
@docs UserID, toString, fromString
## Extra
@docs userIdParser, isHistorical
-}
import Internal.Grammar.ServerName as ServerName exposing (ServerName)
import Internal.Tools.ParserExtra as PE
import Parser as P exposing ((|.), (|=), Parser)
{-| The User ID type defining a user.
-}
type alias UserID =
{ localpart : String, domain : ServerName }
{-| Parse a uniquely identifying string into a Matrix User ID.
-}
fromString : String -> Maybe UserID
fromString =
P.run (userIdParser |. P.end) >> Result.toMaybe
{-| Return a boolean on whether a Matrix user has a historical user ID.
Since this user ID is not SUPPOSED to be legal but clients are nevertheless
forced to support them due to backwards compatibility, clients may occasionally
attempt to break the rules in an attempt to find undefined behaviour.
As a result, an explicit method to spot historical users is added to the SDK.
-}
isHistorical : UserID -> Bool
isHistorical { localpart } =
String.any
(\c ->
let
i : Int
i =
Char.toCode c
in
not ((0x61 <= i && i <= 0x7A) || Char.isAlpha c)
)
localpart
localpartParser : Parser String
localpartParser =
P.chompIf validHistoricalUsernameChar
|> P.getChompedString
|> PE.times 1 255
|> P.map String.concat
{-| Convert a parsed User ID to a string.
-}
toString : UserID -> String
toString { localpart, domain } =
String.concat [ "@", localpart, ":", ServerName.toString domain ]
{-| Parse a UserID from a string.
-}
userIdParser : Parser UserID
userIdParser =
P.succeed UserID
|. P.symbol "@"
|= localpartParser
|. P.symbol ":"
|= ServerName.serverNameParser
|> PE.maxLength 255
validHistoricalUsernameChar : Char -> Bool
validHistoricalUsernameChar c =
let
i : Int
i =
Char.toCode c
in
(0x21 <= i && i <= 0x39) || (0x3B <= i && i <= 0x7E)
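For illustration, a parse/print round trip with a made-up user ID:
```elm
import Internal.Grammar.UserId as UserId exposing (UserID)

-- Just { localpart = "alice", domain = { host = DNS "example.org", port_ = Nothing } }
alice : Maybe UserID
alice =
    UserId.fromString "@alice:example.org"

-- Just "@alice:example.org"
printed : Maybe String
printed =
    Maybe.map UserId.toString alice
```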

View File

@ -1,4 +1,4 @@
module Internal.Tools.Decode exposing
module Internal.Tools.DecodeExtra exposing
( opField, opFieldWithDefault
, map9, map10, map11
)

View File

@ -1,4 +1,4 @@
module Internal.Tools.Encode exposing (maybeObject)
module Internal.Tools.EncodeExtra exposing (maybeObject)
{-|

View File

@ -1,74 +0,0 @@
module Internal.Tools.Grammar exposing (..)
{-|
# Identifier Grammar
The specification defines
[some identifiers](https://spec.matrix.org/v1.9/appendices/#identifier-grammar)
to use the Common Namespaced Identifier Grammar. This is a common grammar
intended for non-user-visible identifiers, with a defined mechanism for
implementations to create new identifiers.
This module documents those identifiers, allowing the Elm SDK to use them.
-}
import Parser as P exposing (Parser)
{-| Parse an IPv6 address
-}
ipv6addressParser : Parser String
ipv6addressParser =
P.chompWhile validIPv6Char
|> P.getChompedString
|> P.andThen
(\out ->
if String.length out > 45 then
P.problem "an ipv6 address has no more than 45 digits"
else if String.length out < 2 then
P.problem "an ipv6 address has at least 2 digits"
else
-- TODO: ipv6 has more specific rules
-- https://datatracker.ietf.org/doc/html/rfc3513#section-2.2
P.succeed out
)
{-| Parse a port value
-}
portParser : Parser Int
portParser =
P.chompWhile Char.isDigit
|> P.getChompedString
|> P.andThen
(\out ->
if String.length out > 5 then
P.problem "a port has no more than 5 digits"
else if String.length out < 1 then
P.problem "a port has at least 1 digit"
else
case String.toInt out of
Nothing ->
P.problem "Expected port int"
Just i ->
P.succeed i
)
{-| Check whether a char is a valid IPv6char
-}
validIPv6char : Char -> Bool
validIPv6Char c =
"0123456789ABCDEFabcdef:."
|> String.toList
|> List.member c

View File

@ -3,8 +3,8 @@ module Internal.Tools.Hashdict exposing
, empty, singleton, insert, remove, removeKey
, isEmpty, member, memberKey, get, size, isEqual
, keys, values, toList, fromList
, rehash, union
, encode, decoder, softDecoder
, rehash, union, map
, coder, encode, decoder, softDecoder
)
{-| This module abstracts the `Dict` type with one function that assigns a
@ -35,18 +35,19 @@ This allows you to store values based on an externally defined identifier.
## Transform
@docs rehash, union
@docs rehash, union, map
## JSON coders
@docs encode, decoder, softDecoder
@docs coder, encode, decoder, softDecoder
-}
import FastDict as Dict exposing (Dict)
import Json.Decode as D
import Json.Encode as E
import Internal.Config.Log as Log
import Internal.Config.Text as Text
import Internal.Tools.Json as Json
{-| A dictionary of keys and values where each key is defined by its value. For
@ -80,25 +81,43 @@ type Hashdict a
}
{-| Define how Hashdict can be encoded to and decoded from a JSON object.
-}
coder : (a -> String) -> Json.Coder a -> Json.Coder (Hashdict a)
coder f c1 =
Json.andThen
{ name = Text.docs.hashdict.name
, description = Text.docs.hashdict.description
, forth =
-- TODO: Implement fastDictWithFilter function
\items ->
case List.filter (\( k, v ) -> f v /= k) (Dict.toList items) of
[] ->
{ hash = f, values = items }
|> Hashdict
|> Json.succeed
|> (|>) []
wrongHashes ->
wrongHashes
|> List.map Tuple.first
|> List.map ((++) "Invalid hash")
|> List.map Log.log.error
|> Json.fail Text.invalidHashInHashdict
, back = \(Hashdict h) -> h.values
, failure =
Text.failures.hashdict
}
(Json.fastDict c1)
{-| Decode a hashdict from a JSON value. To create a hashdict, you are expected
to insert a hash function. If the hash function doesn't properly hash the values
as expected, the decoder will fail to decode the hashdict.
-}
decoder : (a -> String) -> D.Decoder a -> D.Decoder (Hashdict a)
decoder f xDecoder =
D.keyValuePairs xDecoder
|> D.andThen
(\items ->
if List.all (\( hash, value ) -> f value == hash) items then
items
|> Dict.fromList
|> (\d -> { hash = f, values = d })
|> Hashdict
|> D.succeed
else
D.fail "Hash function fails to properly hash all values"
)
decoder : (a -> String) -> Json.Coder a -> Json.Decoder (Hashdict a)
decoder f c1 =
Json.decode (coder f c1)
{-| Create an empty hashdict.
@ -112,12 +131,9 @@ empty hash =
cannot be universally converted to JSON, so it is up to you to preserve that
hash function!
-}
encode : (a -> E.Value) -> Hashdict a -> E.Value
encode encodeX (Hashdict h) =
h.values
|> Dict.toList
|> List.map (Tuple.mapSecond encodeX)
|> E.object
encode : Json.Coder a -> Json.Encoder (Hashdict a)
encode c1 (Hashdict h) =
Json.encode (coder h.hash c1) (Hashdict h)
{-| Convert an association list into a hashdict.
@ -173,6 +189,34 @@ keys (Hashdict h) =
Dict.keys h.values
{-| Map a value on a given key. If the outcome of the function changes the hash,
the operation does nothing.
-}
map : String -> (a -> a) -> Hashdict a -> Hashdict a
map key f (Hashdict h) =
Hashdict
{ h
| values =
Dict.update
key
(Maybe.map
(\value ->
let
newValue : a
newValue =
f value
in
if h.hash newValue == h.hash value then
newValue
else
value
)
)
h.values
}
{-| Determine if a value's hash is in a hashdict.
-}
member : a -> Hashdict a -> Bool
@ -240,10 +284,20 @@ size (Hashdict h) =
used hash function, (or if you simply do not care) you can use this function to
decode and rehash the Hashdict using your new hash function.
-}
softDecoder : (a -> String) -> D.Decoder a -> D.Decoder (Hashdict a)
softDecoder f xDecoder =
D.keyValuePairs xDecoder
|> D.map (List.map Tuple.second >> fromList f)
softDecoder : (a -> String) -> Json.Coder a -> Json.Decoder (Hashdict a)
softDecoder f c1 =
c1
|> Json.fastDict
|> Json.map
{ name = Text.docs.hashdict.name
, description = Text.docs.hashdict.description
, forth =
\items ->
Hashdict { hash = f, values = items }
|> rehash f
, back = \(Hashdict h) -> h.values
}
|> Json.decode
{-| Convert a hashdict into an association list of key-value pairs, sorted by

View File

@ -3,7 +3,7 @@ module Internal.Tools.Iddict exposing
, empty, singleton, insert, map, remove
, isEmpty, member, get, size
, keys, values
, encode, decoder
, coder, encode, decoder
)
{-| The id-dict is a data type that lets us store values in a dictionary using
@ -36,13 +36,13 @@ do not need to generate identifiers yourself.
## JSON coders
@docs encode, decoder
@docs coder, encode, decoder
-}
import FastDict as Dict exposing (Dict)
import Json.Decode as D
import Json.Encode as E
import Internal.Config.Text as Text
import Internal.Tools.Json as Json
{-| The Iddict data type.
@ -54,41 +54,49 @@ type Iddict a
}
{-| Define how an Iddict can be encoded and decoded to and from a JSON value.
-}
coder : Json.Coder a -> Json.Coder (Iddict a)
coder x =
Json.object2
{ name = Text.docs.iddict.name
, description = Text.docs.iddict.description
, init =
\c d ->
Iddict
{ cursor =
Dict.keys d
|> List.maximum
|> Maybe.map ((+) 1)
|> Maybe.withDefault 0
|> max (Dict.size d)
|> max c
, dict = d
}
}
(Json.field.optional.withDefault
{ fieldName = "cursor"
, toField = \(Iddict i) -> i.cursor
, description = Text.fields.iddict.cursor
, coder = Json.int
, default = ( 0, [] )
, defaultToString = String.fromInt
}
)
(Json.field.required
{ fieldName = "dict"
, toField = \(Iddict i) -> i.dict
, description = Text.fields.iddict.dict
, coder = Json.fastIntDict x
}
)
{-| Decode an id-dict from a JSON value.
-}
decoder : D.Decoder a -> D.Decoder (Iddict a)
decoder xDecoder =
D.map2
(\c pairs ->
let
dict : Dict Int a
dict =
pairs
|> List.filterMap
(\( k, v ) ->
k
|> String.toInt
|> Maybe.map (\n -> ( n, v ))
)
|> Dict.fromList
in
Iddict
{ cursor =
Dict.keys dict
-- Larger than all values in the list
|> List.map ((+) 1)
|> List.maximum
|> Maybe.withDefault 0
|> max (Dict.size dict)
-- At least the dict size
|> max c
-- At least the given value
, dict = dict
}
)
(D.field "cursor" D.int)
(D.field "dict" <| D.keyValuePairs xDecoder)
decoder : Json.Coder a -> Json.Decoder (Iddict a)
decoder x =
Json.decode (coder x)
{-| Create an empty id-dict.
@ -103,16 +111,9 @@ empty =
{-| Encode an id-dict to a JSON value.
-}
encode : (a -> E.Value) -> Iddict a -> E.Value
encode encodeX (Iddict d) =
E.object
[ ( "cursor", E.int d.cursor )
, ( "dict"
, d.dict
|> Dict.toCoreDict
|> E.dict String.fromInt encodeX
)
]
encode : Json.Coder a -> Json.Encoder (Iddict a)
encode x =
Json.encode (coder x)
{-| Get a value from the id-dict using its key.

1277
src/Internal/Tools/Json.elm Normal file

File diff suppressed because it is too large

View File

@ -3,8 +3,8 @@ module Internal.Tools.Mashdict exposing
, empty, singleton, insert, remove, removeKey
, isEmpty, member, memberKey, get, size, isEqual
, keys, values, toList, fromList
, rehash, union
, encode, decoder, softDecoder
, rehash, union, map
, coder, encode, decoder, softDecoder
)
{-|
@ -43,18 +43,19 @@ In general, you are advised to learn more about the
## Transform
@docs rehash, union
@docs rehash, union, map
## JSON coders
@docs encode, decoder, softDecoder
@docs coder, encode, decoder, softDecoder
-}
import FastDict as Dict exposing (Dict)
import Json.Decode as D
import Json.Encode as E
import Internal.Config.Log as Log
import Internal.Config.Text as Text
import Internal.Tools.Json as Json
{-| A dictionary of keys and values where each key is defined by its value, but
@ -92,25 +93,41 @@ type Mashdict a
}
{-| Define how a Mashdict can be encoded to and decoded from a JSON object.
-}
coder : (a -> Maybe String) -> Json.Coder a -> Json.Coder (Mashdict a)
coder f c1 =
Json.andThen
{ name = Text.docs.mashdict.name
, description = Text.docs.mashdict.description
, forth =
\items ->
case List.filter (\( k, v ) -> f v /= Just k) (Dict.toList items) of
[] ->
{ hash = f, values = items }
|> Mashdict
|> Json.succeed
|> (|>) []
wrongHashes ->
wrongHashes
|> List.map Tuple.first
|> List.map ((++) "Invalid hash")
|> List.map Log.log.error
|> Json.fail Text.invalidHashInMashdict
, back = \(Mashdict h) -> h.values
, failure = Text.failures.mashdict
}
(Json.fastDict c1)
{-| Decode a mashdict from a JSON value. To create a mashdict, you are expected
to insert a hash function. If the hash function doesn't properly hash the values
as expected, the decoder will fail to decode the mashdict.
-}
decoder : (a -> Maybe String) -> D.Decoder a -> D.Decoder (Mashdict a)
decoder f xDecoder =
D.keyValuePairs xDecoder
|> D.andThen
(\items ->
if List.all (\( hash, value ) -> f value == Just hash) items then
items
|> Dict.fromList
|> (\d -> { hash = f, values = d })
|> Mashdict
|> D.succeed
else
D.fail "Hash function fails to properly hash all values"
)
decoder : (a -> Maybe String) -> Json.Coder a -> Json.Decoder (Mashdict a)
decoder f c1 =
Json.decode (coder f c1)
{-| Create an empty mashdict.
@ -124,12 +141,9 @@ empty hash =
cannot be universally converted to JSON, so it is up to you to preserve that
hash function!
-}
encode : (a -> E.Value) -> Mashdict a -> E.Value
encode encodeX (Mashdict h) =
h.values
|> Dict.toList
|> List.map (Tuple.mapSecond encodeX)
|> E.object
encode : Json.Coder a -> Json.Encoder (Mashdict a)
encode c1 (Mashdict h) =
Json.encode (coder h.hash c1) (Mashdict h)
{-| Convert an association list into a mashdict.
@ -191,6 +205,34 @@ keys (Mashdict h) =
Dict.keys h.values
{-| Map a value on a given key. If the outcome of the function changes the hash,
the operation does nothing.
-}
map : String -> (a -> a) -> Mashdict a -> Mashdict a
map key f (Mashdict h) =
Mashdict
{ h
| values =
Dict.update
key
(Maybe.map
(\value ->
case h.hash (f value) of
Just newHash ->
if newHash == key then
f value
else
value
Nothing ->
value
)
)
h.values
}
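
A hypothetical illustration of the rule above, not part of the diff. It hashes values by their first character (an arbitrary choice for this sketch) and assumes `insert` takes the value before the dict, as elsewhere in this library:

greetings : Mashdict String
greetings =
    -- values are hashed by their first character
    empty (String.left 1 >> Just)
        |> insert "hello"

applied : Mashdict String
applied =
    -- "hello!" still hashes to "h", so the update goes through
    map "h" (\s -> s ++ "!") greetings

ignored : Mashdict String
ignored =
    -- "Hello" hashes to "H" rather than "h", so the dict is left untouched
    map "h" (\s -> "H" ++ String.dropLeft 1 s) greetings
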
{-| Determine if a value's hash is in a mashdict.
-}
member : a -> Mashdict a -> Bool
@ -266,10 +308,20 @@ size (Mashdict h) =
used hash function, (or if you simply do not care) you can use this function to
decode and rehash the Mashdict using your new hash function.
-}
softDecoder : (a -> Maybe String) -> D.Decoder a -> D.Decoder (Mashdict a)
softDecoder f xDecoder =
D.keyValuePairs xDecoder
|> D.map (List.map Tuple.second >> fromList f)
softDecoder : (a -> Maybe String) -> Json.Coder a -> Json.Decoder (Mashdict a)
softDecoder f c1 =
c1
|> Json.fastDict
|> Json.map
{ name = Text.docs.hashdict.name
, description = Text.docs.hashdict.description
, forth =
\items ->
Mashdict { hash = f, values = items }
|> rehash f
, back = \(Mashdict h) -> h.values
}
|> Json.decode
{-| Convert a mashdict into an association list of key-value pairs, sorted by


@ -0,0 +1,105 @@
module Internal.Tools.ParserExtra exposing (..)
import Parser as P exposing ((|.), (|=), Parser)
zeroOrMore : Parser a -> Parser (List a)
zeroOrMore parser =
P.loop []
(\tail ->
P.oneOf
[ P.succeed (\head -> P.Loop (head :: tail))
|= parser
, P.succeed (P.Done (List.reverse tail))
]
)
oneOrMore : Parser a -> Parser (List a)
oneOrMore parser =
P.succeed (::)
|= parser
|= zeroOrMore parser
atLeast : Int -> Parser a -> Parser (List a)
atLeast n parser =
P.loop []
(\tail ->
if List.length tail < n then
P.succeed (\head -> P.Loop (head :: tail))
|= parser
else
P.oneOf
[ P.succeed (\head -> P.Loop (head :: tail))
|= parser
, P.succeed (P.Done (List.reverse tail))
]
)
atMost : Int -> Parser a -> Parser (List a)
atMost n parser =
P.loop []
(\tail ->
if List.length tail < n then
P.oneOf
[ P.succeed (\head -> P.Loop (head :: tail))
|= parser
, P.succeed (P.Done (List.reverse tail))
]
else
P.succeed (P.Done (List.reverse tail))
)
times : Int -> Int -> Parser a -> Parser (List a)
times inf sup parser =
let
low : Int
low =
max 0 (min inf sup)
high : Int
high =
max 0 sup
in
P.loop []
(\tail ->
if List.length tail < low then
P.succeed (\head -> P.Loop (head :: tail))
|= parser
else if List.length tail < high then
P.oneOf
[ P.succeed (\head -> P.Loop (head :: tail))
|= parser
, P.succeed (P.Done (List.reverse tail))
]
else
P.succeed (P.Done (List.reverse tail))
)
exactly : Int -> Parser a -> Parser (List a)
exactly n =
times n n
maxLength : Int -> Parser a -> Parser a
maxLength n parser =
P.succeed
(\start value end ->
if abs (end - start) > n then
P.problem "Parsed too much text!"
else
P.succeed value
)
|= P.getOffset
|= parser
|= P.getOffset
|> P.andThen identity
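
A small usage sketch, not part of the diff, showing how these combinators compose with elm/parser:

-- Accept one to three digit characters, such as "7", "42" or "365".
shortNumber : Parser (List String)
shortNumber =
    P.chompIf Char.isDigit
        |> P.getChompedString
        |> times 1 3
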


@ -1,6 +1,6 @@
module Internal.Tools.Timestamp exposing
( Timestamp
, encode, decoder
, coder, encode, decoder
)
{-| The Timestamp module is a simplification of the Timestamp as delivered by
@ -14,12 +14,11 @@ elm/time. This module offers ways to work with the timestamp in meaningful ways.
## JSON coders
@docs encode, decoder
@docs coder, encode, decoder
-}
import Json.Decode as D
import Json.Encode as E
import Internal.Tools.Json as Json
import Time
@ -29,15 +28,30 @@ type alias Timestamp =
Time.Posix
{-| Create a Json coder
-}
coder : Json.Coder Timestamp
coder =
Json.map
{ back = Time.posixToMillis
, forth = Time.millisToPosix
, name = "Milliseconds to POSIX"
, description =
[ "Converts the timestamp from milliseconds to a POSIX timestamp."
]
}
Json.int
{-| Encode a timestamp into a JSON value.
-}
encode : Timestamp -> E.Value
encode : Json.Encoder Timestamp
encode =
Time.posixToMillis >> E.int
Json.encode coder
{-| Decode a timestamp from a JSON value.
-}
decoder : D.Decoder Timestamp
decoder : Json.Decoder Timestamp
decoder =
D.map Time.millisToPosix D.int
Json.decode coder
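
A sketch of what the coder above produces, not part of the diff, assuming `Json.Encoder` is a plain function to `Json.Value`:

example : Json.Value
example =
    -- encodes to the JSON number 1704067200000 (an arbitrary timestamp
    -- in milliseconds since the UNIX epoch)
    encode (Time.millisToPosix 1704067200000)
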


@ -1,5 +1,5 @@
module Internal.Values.Context exposing
( Context, init, encode, decoder
( Context, init, coder, encode, decoder
, APIContext, apiFormat
, setAccessToken, getAccessToken
, setBaseUrl, getBaseUrl
@ -14,7 +14,7 @@ the Matrix API.
## Context
@docs Context, init, encode, decoder
@docs Context, init, coder, encode, decoder
## APIContext
@ -50,10 +50,8 @@ information that can be inserted.
-}
import Internal.Config.Leaks as L
import Internal.Tools.Decode as D
import Internal.Tools.Encode as E
import Json.Decode as D
import Json.Encode as E
import Internal.Config.Text as Text
import Internal.Tools.Json as Json
{-| The Context type stores all the information in the Vault. This data type is
@ -97,33 +95,78 @@ apiFormat context =
}
{-| Define how a Context can be encoded to and decoded from a JSON object.
-}
coder : Json.Coder Context
coder =
Json.object7
{ name = Text.docs.context.name
, description = Text.docs.context.description
, init = Context
}
(Json.field.optional.value
{ fieldName = "accessToken"
, toField = .accessToken
, description = Text.fields.context.accessToken
, coder = Json.string
}
)
(Json.field.optional.value
{ fieldName = "baseUrl"
, toField = .baseUrl
, description = Text.fields.context.baseUrl
, coder = Json.string
}
)
(Json.field.optional.value
{ fieldName = "password"
, toField = .password
, description = Text.fields.context.password
, coder = Json.string
}
)
(Json.field.optional.value
{ fieldName = "refreshToken"
, toField = .refreshToken
, description = Text.fields.context.refreshToken
, coder = Json.string
}
)
(Json.field.optional.value
{ fieldName = "username"
, toField = .username
, description = Text.fields.context.username
, coder = Json.string
}
)
(Json.field.optional.value
{ fieldName = "transaction"
, toField = .transaction
, description = Text.fields.context.transaction
, coder = Json.string
}
)
(Json.field.optional.value
{ fieldName = "versions"
, toField = .versions
, description = Text.fields.context.versions
, coder = Json.list Json.string
}
)
{-| Decode a Context type from a JSON value.
-}
decoder : D.Decoder Context
decoder : Json.Decoder Context
decoder =
D.map7 Context
(D.opField "accessToken" D.string)
(D.opField "baseUrl" D.string)
(D.opField "password" D.string)
(D.opField "refreshToken" D.string)
(D.opField "username" D.string)
(D.opField "transaction" D.string)
(D.opField "versions" (D.list D.string))
Json.decode coder
{-| Encode a Context type into a JSON value.
-}
encode : Context -> E.Value
encode context =
E.maybeObject
[ ( "accessToken", Maybe.map E.string context.accessToken )
, ( "baseUrl", Maybe.map E.string context.baseUrl )
, ( "password", Maybe.map E.string context.password )
, ( "refreshToken", Maybe.map E.string context.refreshToken )
, ( "username", Maybe.map E.string context.username )
, ( "transaction", Maybe.map E.string context.transaction )
, ( "versions", Maybe.map (E.list E.string) context.versions )
]
encode : Json.Encoder Context
encode =
Json.encode coder
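
Because every field above is optional, decoding an empty object should yield a context with every field set to `Nothing`. A rough sketch, not part of the diff, where `D` stands for elm/json's `Json.Decode` (no longer imported by this module directly):

blankContext : Maybe Context
blankContext =
    "{}"
        -- Json.decode pairs the decoded value with a list of log messages
        |> D.decodeString (Json.decode coder)
        |> Result.toMaybe
        |> Maybe.map Tuple.first
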
{-| A basic, untouched version of the Context, containing no information.


@ -4,7 +4,7 @@ module Internal.Values.Envelope exposing
, Settings, mapSettings, extractSettings
, mapContext
, getContent, extract
, encode, decoder
, coder, encode, decoder
)
{-| The Envelope module wraps existing data types with lots of values and
@ -38,17 +38,14 @@ settings that can be adjusted manually.
## JSON coders
@docs encode, decoder
@docs coder, encode, decoder
-}
import Internal.Config.Default as Default
import Internal.Tools.Decode as D
import Internal.Tools.Encode as E
import Internal.Config.Text as Text
import Internal.Tools.Json as Json
import Internal.Values.Context as Context exposing (Context)
import Internal.Values.Settings as Settings
import Json.Decode as D
import Json.Encode as E
{-| There are lots of different data types in the Elm SDK, and many of them
@ -71,28 +68,54 @@ type alias Settings =
Settings.Settings
{-| Define how an Envelope can be encoded to and decoded from a JSON object.
-}
coder : Json.Coder a -> Json.Coder (Envelope a)
coder c1 =
Json.object3
{ name = Text.docs.envelope.name
, description = Text.docs.envelope.description
, init = Envelope
}
(Json.field.required
{ fieldName = "content"
, toField = .content
, description = Text.fields.envelope.content
, coder = c1
}
)
(Json.field.required
{ fieldName = "context"
, toField = .context
, description = Text.fields.envelope.context
, coder = Context.coder
}
)
(Json.field.optional.withDefault
{ fieldName = "settings"
, toField = .settings
, description = Text.fields.envelope.settings
, coder = Settings.coder
, default = Tuple.pair Settings.init []
, defaultToString = always "<Default settings>"
}
)
{-| Decode an enveloped type from a JSON value. The decoder also imports any
potential tokens, values and settings included in the JSON.
-}
decoder : D.Decoder a -> D.Decoder (Envelope a)
decoder xDecoder =
D.map3 Envelope
(D.field "content" xDecoder)
(D.field "context" Context.decoder)
(D.field "settings" Settings.decoder)
decoder : Json.Coder a -> Json.Decoder (Envelope a)
decoder c1 =
Json.decode (coder c1)
{-| Encode an enveloped type into a JSON value. The function encodes all
non-standard settings, tokens and values.
-}
encode : (a -> E.Value) -> Envelope a -> E.Value
encode encodeX data =
E.object
[ ( "content", encodeX data.content )
, ( "context", Context.encode data.context )
, ( "settings", Settings.encode data.settings )
, ( "version", E.string Default.currentVersion )
]
encode : Json.Coder a -> Json.Encoder (Envelope a)
encode c1 =
Json.encode (coder c1)
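
Since `coder` takes the content coder as an argument, one definition covers any enveloped type. A minimal sketch, not part of the diff:

stringEnvelope : Json.Coder (Envelope String)
stringEnvelope =
    -- the same pattern works for any content type with a Json.Coder
    coder Json.string
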
{-| Map a function, then get its content. This is useful for getting information


@ -1,7 +1,8 @@
module Internal.Values.Event exposing
( Event
, UnsignedData(..), age, prevContent, redactedBecause, transactionId
, encode, decoder
, coder, encode, decoder
, isEqual
)
{-|
@ -22,26 +23,30 @@ of a room.
## JSON Coder
@docs encode, decoder
@docs coder, encode, decoder
## Test functions
@docs isEqual
-}
import Internal.Config.Default as Default
import Internal.Tools.Decode as D
import Internal.Tools.Encode as E
import Internal.Config.Text as Text
import Internal.Tools.Json as Json
import Internal.Tools.Timestamp as Timestamp exposing (Timestamp)
import Json.Decode as D
import Internal.Values.User as User exposing (User)
import Json.Encode as E
{-| The Event type occurs everywhere on a user's timeline.
-}
type alias Event =
{ content : E.Value
{ content : Json.Value
, eventId : String
, originServerTs : Timestamp
, roomId : String
, sender : String
, sender : User
, stateKey : Maybe String
, eventType : String
, unsigned : Maybe UnsignedData
@ -54,7 +59,7 @@ helper functions.
type UnsignedData
= UnsignedData
{ age : Maybe Int
, prevContent : Maybe E.Value
, prevContent : Maybe Json.Value
, redactedBecause : Maybe Event
, transactionId : Maybe String
}
@ -67,66 +72,154 @@ age event =
Maybe.andThen (\(UnsignedData data) -> data.age) event.unsigned
{-| Define how an Event can be encoded to and decoded from a JSON object.
-}
coder : Json.Coder Event
coder =
Json.object8
{ name = Text.docs.event.name
, description = Text.docs.event.description
, init = Event
}
(Json.field.required
{ fieldName = "content"
, toField = .content
, description = Text.fields.event.content
, coder = Json.value
}
)
(Json.field.required
{ fieldName = "eventId"
, toField = .eventId
, description = Text.fields.event.eventId
, coder = Json.string
}
)
(Json.field.required
{ fieldName = "originServerTs"
, toField = .originServerTs
, description = Text.fields.event.originServerTs
, coder = Timestamp.coder
}
)
(Json.field.required
{ fieldName = "roomId"
, toField = .roomId
, description = Text.fields.event.roomId
, coder = Json.string
}
)
(Json.field.required
{ fieldName = "sender"
, toField = .sender
, description = Text.fields.event.sender
, coder = User.coder
}
)
(Json.field.optional.value
{ fieldName = "stateKey"
, toField = .stateKey
, description = Text.fields.event.stateKey
, coder = Json.string
}
)
(Json.field.required
-- NOTE! | In JSON we call it `type`, not `eventType`,
-- NOTE! | so that the data is easier to read for other non-Elm
-- NOTE! | JSON parsers
{ fieldName = "type"
, toField = .eventType
, description = Text.fields.event.eventType
, coder = Json.string
}
)
(Json.field.optional.value
{ fieldName = "unsigned"
, toField = .unsigned
, description = Text.fields.event.unsigned
, coder = unsignedCoder
}
)
{-| Decode an Event from a JSON value.
-}
decoder : D.Decoder Event
decoder : Json.Decoder Event
decoder =
D.map8 Event
(D.field "content" D.value)
(D.field "eventId" D.string)
(D.field "originServerTs" Timestamp.decoder)
(D.field "roomId" D.string)
(D.field "sender" D.string)
(D.opField "stateKey" D.string)
(D.field "eventType" D.string)
(D.opField "unsigned" decoderUnsignedData)
{-| Decode Unsigned Data from a JSON value.
-}
decoderUnsignedData : D.Decoder UnsignedData
decoderUnsignedData =
D.map4 (\a b c d -> UnsignedData { age = a, prevContent = b, redactedBecause = c, transactionId = d })
(D.opField "age" D.int)
(D.opField "prevContent" D.value)
(D.opField "redactedBecause" (D.lazy (\_ -> decoder)))
(D.opField "transactionId" D.string)
Json.decode coder
{-| Encode an Event into a JSON value.
-}
encode : Event -> E.Value
encode event =
E.maybeObject
[ ( "content", Just event.content )
, ( "eventId", Just <| E.string event.eventId )
, ( "originServerTs", Just <| Timestamp.encode event.originServerTs )
, ( "roomId", Just <| E.string event.roomId )
, ( "sender", Just <| E.string event.sender )
, ( "stateKey", Maybe.map E.string event.stateKey )
, ( "eventType", Just <| E.string event.eventType )
, ( "unsigned", Maybe.map encodeUnsignedData event.unsigned )
, ( "version", Just <| E.string Default.currentVersion )
]
encode : Json.Encoder Event
encode =
Json.encode coder
{-| Encode Unsigned Data into a JSON value.
{-| Compare two events and determine whether they're identical. Used mostly for
testing purposes.
-}
encodeUnsignedData : UnsignedData -> E.Value
encodeUnsignedData (UnsignedData data) =
E.maybeObject
[ ( "age", Maybe.map E.int data.age )
, ( "prevContent", data.prevContent )
, ( "redactedBecause", Maybe.map encode data.redactedBecause )
, ( "transactionId", Maybe.map E.string data.transactionId )
]
isEqual : Event -> Event -> Bool
isEqual e1 e2 =
if e1.eventId /= e2.eventId then
False
else if e1.originServerTs /= e2.originServerTs then
False
else if e1.roomId /= e2.roomId then
False
else if e1.sender /= e2.sender then
False
else if e1.stateKey /= e2.stateKey then
False
else if e1.eventType /= e2.eventType then
False
else
case ( e1.unsigned, e2.unsigned ) of
( Nothing, Nothing ) ->
True
( Just _, Nothing ) ->
False
( Nothing, Just _ ) ->
False
( Just (UnsignedData d1), Just (UnsignedData d2) ) ->
if d1.age /= d2.age then
False
else if d1.transactionId /= d2.transactionId then
False
else if Maybe.map (E.encode 0) d1.prevContent /= Maybe.map (E.encode 0) d2.prevContent then
False
else
case ( d1.redactedBecause, d2.redactedBecause ) of
( Nothing, Nothing ) ->
True
( Nothing, Just _ ) ->
False
( Just _, Nothing ) ->
False
( Just se1, Just se2 ) ->
isEqual se1 se2
{-| Determine the previous `content` value for this event. This field is only a
`Just value` if the event is a state event, and the Matrix Vault has permission
to see the previous content.
-}
prevContent : Event -> Maybe E.Value
prevContent : Event -> Maybe Json.Value
prevContent event =
Maybe.andThen (\(UnsignedData data) -> data.prevContent) event.unsigned
@ -145,3 +238,40 @@ display the original transaction id used for the event.
transactionId : Event -> Maybe String
transactionId event =
Maybe.andThen (\(UnsignedData data) -> data.transactionId) event.unsigned
unsignedCoder : Json.Coder UnsignedData
unsignedCoder =
Json.object4
{ name = Text.docs.unsigned.name
, description = Text.docs.unsigned.description
, init = \a b c d -> UnsignedData { age = a, prevContent = b, redactedBecause = c, transactionId = d }
}
(Json.field.optional.value
{ fieldName = "age"
, toField = \(UnsignedData data) -> data.age
, description = Text.fields.unsigned.age
, coder = Json.int
}
)
(Json.field.optional.value
{ fieldName = "prevContent"
, toField = \(UnsignedData data) -> data.prevContent
, description = Text.fields.unsigned.prevContent
, coder = Json.value
}
)
(Json.field.optional.value
{ fieldName = "redactedBecause"
, toField = \(UnsignedData data) -> data.redactedBecause
, description = Text.fields.unsigned.redactedBecause
, coder = Json.lazy (\_ -> coder)
}
)
(Json.field.optional.value
{ fieldName = "transactionId"
, toField = \(UnsignedData data) -> data.transactionId
, description = Text.fields.unsigned.transactionId
, coder = Json.string
}
)
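
A rough sanity check of the field renaming noted above, not part of the diff: encoding an event should produce a `type` field rather than `eventType`. All values below are made up, and elm/time is assumed to be imported:

exampleJson : Maybe Json.Value
exampleJson =
    -- "@alice:example.org" is a made-up, well-formed Matrix ID
    User.fromString "@alice:example.org"
        |> Maybe.map
            (\alice ->
                encode
                    { content = Json.encode Json.string "hello" -- assumes Json.Value is elm/json's Value
                    , eventId = "$example"
                    , originServerTs = Time.millisToPosix 0
                    , roomId = "!room:example.org"
                    , sender = alice
                    , stateKey = Nothing
                    , eventType = "m.room.message"
                    , unsigned = Nothing
                    }
            )
    -- The resulting object carries "type": "m.room.message", not "eventType".
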


@ -1,6 +1,6 @@
module Internal.Values.Settings exposing
( Settings, init
, encode, decoder
, coder, encode, decoder
)
{-|
@ -16,15 +16,13 @@ data types.
## JSON coders
@docs encode, decoder
@docs coder, encode, decoder
-}
import Internal.Config.Default as Default
import Internal.Tools.Decode as D
import Internal.Tools.Encode as E
import Json.Decode as D
import Json.Encode as E
import Internal.Config.Text as Text
import Internal.Tools.Json as Json
{-| Custom settings that can be manipulated by the user. These serve as a
@ -41,46 +39,56 @@ type alias Settings =
}
{-| Define how a Settings type can be encoded to and decoded from a JSON object.
-}
coder : Json.Coder Settings
coder =
Json.object3
{ name = Text.docs.settings.name
, description = Text.docs.settings.description
, init = Settings
}
(Json.field.optional.withDefault
{ fieldName = "currentVersion"
, toField = .currentVersion
, description = Text.fields.settings.currentVersion
, coder = Json.string
, default = Tuple.pair Default.currentVersion []
, defaultToString = identity
}
)
(Json.field.optional.withDefault
{ fieldName = "deviceName"
, toField = .deviceName
, description = Text.fields.settings.deviceName
, coder = Json.string
, default = Tuple.pair Default.deviceName []
, defaultToString = identity
}
)
(Json.field.optional.withDefault
{ fieldName = "syncTime"
, toField = .syncTime
, description = Text.fields.settings.syncTime
, coder = Json.int
, default = Tuple.pair Default.syncTime []
, defaultToString = String.fromInt
}
)
{-| Decode settings from a JSON value.
-}
decoder : D.Decoder Settings
decoder : Json.Decoder Settings
decoder =
D.map3 Settings
(D.opFieldWithDefault "currentVersion" Default.currentVersion D.string)
(D.opFieldWithDefault "deviceName" Default.deviceName D.string)
(D.opFieldWithDefault "syncTime" Default.syncTime D.int)
Json.decode coder
{-| Encode the settings into a JSON value.
-}
encode : Settings -> E.Value
encode settings =
let
differentFrom : b -> b -> Maybe b
differentFrom defaultValue currentValue =
if currentValue == defaultValue then
Nothing
else
Just currentValue
in
E.maybeObject
[ ( "currentVersion"
, settings.currentVersion
|> differentFrom Default.currentVersion
|> Maybe.map E.string
)
, ( "deviceName"
, settings.deviceName
|> differentFrom Default.deviceName
|> Maybe.map E.string
)
, ( "syncTime"
, settings.syncTime
|> differentFrom Default.syncTime
|> Maybe.map E.int
)
]
encode : Json.Encoder Settings
encode =
Json.encode coder
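
Because every field above falls back to a default, decoding an empty object should yield the default settings. A sketch, not part of the diff, where `D` stands for elm/json's `Json.Decode`:

defaultSettings : Maybe Settings
defaultSettings =
    "{}"
        -- expected to equal init: Default.currentVersion, Default.deviceName, Default.syncTime
        |> D.decodeString (Json.decode coder)
        |> Result.toMaybe
        |> Maybe.map Tuple.first
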
{-| Create a new Settings module based on default values


@ -3,7 +3,7 @@ module Internal.Values.StateManager exposing
, empty, singleton, insert, remove, append
, isEmpty, member, memberKey, get, size, isEqual
, keys, values, fromList, toList
, encode, decoder
, coder, encode, decoder
)
{-| The StateManager tracks the room state based on events, their event types
@ -34,15 +34,15 @@ dictionary-like experience to navigate through the Matrix room state.
## JSON coders
@docs encode, decoder
@docs coder, encode, decoder
-}
import FastDict as Dict exposing (Dict)
import Internal.Config.Text as Text
import Internal.Tools.Json as Json
import Internal.Tools.Mashdict as Mashdict exposing (Mashdict)
import Internal.Values.Event as Event exposing (Event)
import Json.Decode as D
import Json.Encode as E
{-| The StateManager manages the room state by gathering events and looking at
@ -93,15 +93,26 @@ cleanKey key (StateManager manager) =
|> StateManager
{-| Define how a StateManager can be encoded to and decoded from a JSON object.
-}
coder : Json.Coder StateManager
coder =
Event.coder
|> Mashdict.coder .stateKey
|> Json.fastDict
|> Json.map
{ name = Text.docs.stateManager.name
, description = Text.docs.stateManager.description
, forth = StateManager
, back = \(StateManager manager) -> manager
}
{-| Decode a StateManager from a JSON value.
-}
decoder : D.Decoder StateManager
decoder : Json.Decoder StateManager
decoder =
Event.decoder
|> Mashdict.decoder .stateKey
|> D.keyValuePairs
|> D.map Dict.fromList
|> D.map StateManager
Json.decode coder
{-| Create an empty StateManager.
@ -113,11 +124,9 @@ empty =
{-| Encode a StateManager into a JSON value.
-}
encode : StateManager -> E.Value
encode (StateManager manager) =
manager
|> Dict.toCoreDict
|> E.dict identity (Mashdict.encode Event.encode)
encode : Json.Encoder StateManager
encode =
Json.encode coder
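
A small sketch of the encoded shape implied by the composition above, not part of the diff:

encodedEmpty : Json.Value
encodedEmpty =
    -- a nested object: outer keys come from the manager's dictionary,
    -- inner keys are the events' state keys (via the Mashdict).
    -- An empty manager encodes to an empty object.
    encode empty
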
{-| Build a StateManager using a list of events.


@ -0,0 +1,707 @@
module Internal.Values.Timeline exposing
( Batch, Timeline
, empty, singleton
, mostRecentEvents, mostRecentEventsFrom
, addSync, insert
, coder, encode, decoder
)
{-|
# Timeline
The Timeline data type represents a timeline in the Matrix room. The Matrix room
timeline is quite a complex data type, as it is constantly only partially known
by the Matrix client. This module exposes a data type that helps explore, track
and maintain this room state.
This design of the timeline uses the batches as waypoints to maintain an order.
The Matrix API often returns batches that have the following four pieces of
information:
1. A list of events.
2. A filter for which all of the events meet the criteria.
3. An end batch token.
4. _(Optional)_ A start batch token. If it is not provided, it is the start of
the timeline.
Here's an example of such a timeline batch:
    |-->[■]->[■]->[●]->[■]->[■]->[●]-->|
    |                                  |
    |<--    filter: only ■ and ●    -->|
    |                                  |
  start:                             end:
  <token_1>                     <token_2>
When the Matrix API later returns a batch token that starts with `<token_2>`,
we know that we can connect it to the batch above and make a longer list of
events!
## Batch
@docs Batch, Timeline
## Create
@docs empty, singleton
## Query
@docs mostRecentEvents, mostRecentEventsFrom
## Manipulate
@docs addSync, insert
## JSON coder
@docs coder, encode, decoder
-}
import FastDict as Dict exposing (Dict)
import Internal.Config.Text as Text
import Internal.Filter.Timeline as Filter exposing (Filter)
import Internal.Tools.Hashdict as Hashdict exposing (Hashdict)
import Internal.Tools.Iddict as Iddict exposing (Iddict)
import Internal.Tools.Json as Json
import Recursion
import Recursion.Traverse
import Set exposing (Set)
{-| A batch is a batch of events that is placed onto the Timeline. Functions
that require an insertion, generally require this data type.
If the `start` value is `Nothing`, it is either the start of the timeline or the
start of the timeline part that the user is allowed to view.
-}
type alias Batch =
{ events : List String
, filter : Filter
, start : Maybe TokenValue
, end : TokenValue
}
{-| Internal batch that's being saved by the Timeline to track a list of events.
-}
type alias IBatch =
{ events : List String
, filter : Filter
, start : ITokenPTR
, end : ITokenPTR
}
{-| Pointer to an IBatch in the Timeline.
-}
type IBatchPTR
= IBatchPTR IBatchPTRValue
{-| Location indicator of an IBatch in the Timeline.
-}
type alias IBatchPTRValue =
Int
{-| Internal token value that's being stored by the Timeline.
If name is `Nothing`, it indicates the start of the timeline.
-}
type alias IToken =
{ name : TokenValue
, starts : Set IBatchPTRValue -- This itoken starts the following batches
, ends : Set IBatchPTRValue -- This itoken ends the following batches
, inFrontOf : Set ITokenPTRValue -- This itoken is in front of the following tokens
, behind : Set ITokenPTRValue -- This itoken is behind the following tokens
}
{-| Pointer to an IToken in the Timeline.
-}
type ITokenPTR
= ITokenPTR ITokenPTRValue
| StartOfTimeline
{-| Location indicator of an IToken in the Timeline.
-}
type alias ITokenPTRValue =
String
{-| The Timeline type represents the timeline state in a Matrix room.
Following the description of the Matrix spec, a timeline contains the following
items:
- Events that indicate timeline events
- Batch values that can be used to paginate through the timeline
The topological shape of the timeline makes older API responses somewhat
unreliable - as a result, the Timeline tracks tokens and batches explicitly so
that newly received batches can be connected to the ones received earlier.
-}
type Timeline
= Timeline
{ batches : Iddict IBatch
, events : Dict String ( IBatchPTR, List IBatchPTR )
, filledBatches : Int
, mostRecentBatch : ITokenPTR
, tokens : Hashdict IToken
}
{-| Opaque token value sent by the Matrix API
-}
type alias TokenValue =
String
{-| Add a new batch as a sync
-}
addSync : Batch -> Timeline -> Timeline
addSync batch timeline =
case insertBatch batch timeline of
( Timeline t, { start, end } ) ->
let
old : ITokenPTR
old =
t.mostRecentBatch
in
case Timeline { t | mostRecentBatch = end } of
tl ->
if old == start then
tl
else
connectITokenToIToken old start tl
{-| Define how a Timeline can be encoded and decoded to and from a JSON value.
-}
coder : Json.Coder Timeline
coder =
Json.object5
{ name = Text.docs.timeline.name
, description = Text.docs.timeline.description
, init =
\a b c d e ->
Timeline
{ batches = a
, events = b
, filledBatches = c
, mostRecentBatch = d
, tokens = e
}
}
(Json.field.required
{ fieldName = "batches"
, toField = \(Timeline t) -> t.batches
, description = Text.fields.timeline.batches
, coder = Iddict.coder coderIBatch
}
)
(Json.field.required
{ fieldName = "events"
, toField = \(Timeline t) -> t.events
, description = Text.fields.timeline.events
, coder = Json.fastDict (Json.listWithOne coderIBatchPTR)
}
)
(Json.field.optional.withDefault
{ fieldName = "filledBatches"
, toField = \(Timeline t) -> t.filledBatches
, description = Text.fields.timeline.filledBatches
, coder = Json.int
, default = ( 0, [] )
, defaultToString = String.fromInt
}
)
(Json.field.required
{ fieldName = "mostRecentBatch"
, toField = \(Timeline t) -> t.mostRecentBatch
, description = Text.fields.timeline.mostRecentBatch
, coder = coderITokenPTR
}
)
(Json.field.required
{ fieldName = "tokens"
, toField = \(Timeline t) -> t.tokens
, description = Text.fields.timeline.tokens
, coder = Hashdict.coder .name coderIToken
}
)
{-| Define how to encode and decode an IBatch to and from a JSON value.
-}
coderIBatch : Json.Coder IBatch
coderIBatch =
Json.object4
{ name = Text.docs.ibatch.name
, description = Text.docs.ibatch.description
, init = IBatch
}
(Json.field.required
{ fieldName = "events"
, toField = .events
, description = Text.fields.ibatch.events
, coder = Json.list Json.string
}
)
(Json.field.required
{ fieldName = "filter"
, toField = .filter
, description = Text.fields.ibatch.filter
, coder = Filter.coder
}
)
(Json.field.required
{ fieldName = "start"
, toField = .start
, description = Text.fields.ibatch.start
, coder = coderITokenPTR
}
)
(Json.field.required
{ fieldName = "end"
, toField = .end
, description = Text.fields.ibatch.end
, coder = coderITokenPTR
}
)
{-| Define how to encode and decode an IBatchPTR to and from a JSON value.
-}
coderIBatchPTR : Json.Coder IBatchPTR
coderIBatchPTR =
Json.map
{ name = Text.docs.itoken.name
, description = Text.docs.itoken.description
, back = \(IBatchPTR value) -> value
, forth = IBatchPTR
}
coderIBatchPTRValue
{-| Define how to encode and decode an IBatchPTRValue to and from a JSON value.
-}
coderIBatchPTRValue : Json.Coder IBatchPTRValue
coderIBatchPTRValue =
Json.int
{-| Define how to encode and decode an IToken to and from a JSON value.
-}
coderIToken : Json.Coder IToken
coderIToken =
Json.object5
{ name = Text.docs.itoken.name
, description = Text.docs.itoken.description
, init = IToken
}
(Json.field.required
{ fieldName = "name"
, toField = .name
, description = Text.fields.itoken.name
, coder = coderTokenValue
}
)
(Json.field.optional.withDefault
{ fieldName = "starts"
, toField = .starts
, description = Text.fields.itoken.starts
, coder = Json.set coderIBatchPTRValue
, default = ( Set.empty, [] )
, defaultToString = always "[]"
}
)
(Json.field.optional.withDefault
{ fieldName = "ends"
, toField = .ends
, description = Text.fields.itoken.ends
, coder = Json.set coderIBatchPTRValue
, default = ( Set.empty, [] )
, defaultToString = always "[]"
}
)
(Json.field.optional.withDefault
{ fieldName = "inFrontOf"
, toField = .inFrontOf
, description = Text.fields.itoken.inFrontOf
, coder = Json.set coderITokenPTRValue
, default = ( Set.empty, [] )
, defaultToString = always "[]"
}
)
(Json.field.optional.withDefault
{ fieldName = "behind"
, toField = .behind
, description = Text.fields.itoken.behind
, coder = Json.set coderITokenPTRValue
, default = ( Set.empty, [] )
, defaultToString = always "[]"
}
)
{-| Define how to encode and decode an ITokenPTR to and from a JSON value.
-}
coderITokenPTR : Json.Coder ITokenPTR
coderITokenPTR =
Json.maybe coderITokenPTRValue
|> Json.map
{ name = Text.mappings.itokenPTR.name
, description = Text.mappings.itokenPTR.description
, back =
\itokenptr ->
case itokenptr of
ITokenPTR name ->
Just name
StartOfTimeline ->
Nothing
, forth =
\value ->
case value of
Just name ->
ITokenPTR name
Nothing ->
StartOfTimeline
}
{-| Define how to encode and decode an ITokenPTRValue to and from a JSON value.
-}
coderITokenPTRValue : Json.Coder ITokenPTRValue
coderITokenPTRValue =
Json.string
{-| Define how to encode and decode a TokenValue to and from a JSON value.
-}
coderTokenValue : Json.Coder TokenValue
coderTokenValue =
Json.string
{-| Append a token at the end of a batch.
-}
connectIBatchToIToken : IBatchPTR -> ITokenPTR -> Timeline -> Timeline
connectIBatchToIToken (IBatchPTR bptr) pointer (Timeline tl) =
case pointer of
StartOfTimeline ->
Timeline tl
ITokenPTR tptr ->
Timeline
{ tl
| batches =
Iddict.map bptr
(\batch -> { batch | end = pointer })
tl.batches
, tokens =
Hashdict.map tptr
(\token -> { token | ends = Set.insert bptr token.ends })
tl.tokens
}
{-| Append a token at the start of a batch.
-}
connectITokenToIBatch : ITokenPTR -> IBatchPTR -> Timeline -> Timeline
connectITokenToIBatch pointer (IBatchPTR bptr) (Timeline tl) =
case pointer of
StartOfTimeline ->
Timeline tl
ITokenPTR tptr ->
Timeline
{ tl
| tokens =
Hashdict.map tptr
(\token -> { token | starts = Set.insert bptr token.starts })
tl.tokens
, batches =
Iddict.map bptr
(\batch -> { batch | start = pointer })
tl.batches
}
{-| Connect two tokens to each other, revealing their relative location.
-}
connectITokenToIToken : ITokenPTR -> ITokenPTR -> Timeline -> Timeline
connectITokenToIToken pointer1 pointer2 (Timeline tl) =
case ( pointer1, pointer2 ) of
( ITokenPTR early, ITokenPTR late ) ->
if early == late then
Timeline tl
else
Timeline
{ tl
| tokens =
tl.tokens
|> Hashdict.map early
(\data ->
{ data | behind = Set.insert late data.behind }
)
|> Hashdict.map late
(\data ->
{ data | inFrontOf = Set.insert early data.inFrontOf }
)
}
( _, _ ) ->
Timeline tl
{-| Timeline JSON decoder that helps decode a Timeline from JSON.
-}
decoder : Json.Decoder Timeline
decoder =
Json.decode coder
{-| Create a new empty timeline.
-}
empty : Timeline
empty =
Timeline
{ batches = Iddict.empty
, events = Dict.empty
, filledBatches = 0
, mostRecentBatch = StartOfTimeline
, tokens = Hashdict.empty .name
}
{-| Directly encode a Timeline into a JSON value.
-}
encode : Json.Encoder Timeline
encode =
Json.encode coder
{-| Get an IBatch from the Timeline.
-}
getIBatch : IBatchPTR -> Timeline -> Maybe IBatch
getIBatch (IBatchPTR ptr) (Timeline { batches }) =
Iddict.get ptr batches
{-| Get an IToken from the Timeline.
-}
getITokenFromPTR : ITokenPTR -> Timeline -> Maybe IToken
getITokenFromPTR pointer (Timeline { tokens }) =
case pointer of
ITokenPTR ptr ->
Hashdict.get ptr tokens
StartOfTimeline ->
Nothing
{-| Insert a batch anywhere else in the timeline.
-}
insert : Batch -> Timeline -> Timeline
insert batch timeline =
timeline
|> insertBatch batch
|> Tuple.first
{-| Insert a batch into the timeline.
-}
insertBatch : Batch -> Timeline -> ( Timeline, { start : ITokenPTR, end : ITokenPTR } )
insertBatch batch timeline =
case batch.start of
Just start ->
timeline
|> invokeIToken start
|> Tuple.mapSecond (invokeIToken batch.end)
|> (\( startPTR, ( endPTR, newTimeline ) ) ->
( insertIBatch
{ events = batch.events
, filter = batch.filter
, start = startPTR
, end = endPTR
}
newTimeline
, { start = startPTR, end = endPTR }
)
)
Nothing ->
timeline
|> invokeIToken batch.end
|> (\( endPTR, newTimeline ) ->
( insertIBatch
{ events = batch.events
, filter = batch.filter
, start = StartOfTimeline
, end = endPTR
}
newTimeline
, { start = StartOfTimeline, end = endPTR }
)
)
{-| Insert an internal batch into the timeline, and determine its result.
-}
insertIBatch : IBatch -> Timeline -> Timeline
insertIBatch ibatch (Timeline tl) =
case Iddict.insert ibatch tl.batches of
( batchPTR, newBatches ) ->
{ tl
| batches = newBatches
, events =
List.foldl
(\event dict ->
Dict.update event
(\value ->
case value of
Nothing ->
Just ( IBatchPTR batchPTR, [] )
Just ( head, tail ) ->
Just ( IBatchPTR batchPTR, head :: tail )
)
dict
)
tl.events
ibatch.events
, filledBatches =
if List.isEmpty ibatch.events then
tl.filledBatches
else
tl.filledBatches + 1
}
|> Timeline
|> connectITokenToIBatch ibatch.start (IBatchPTR batchPTR)
|> connectIBatchToIToken (IBatchPTR batchPTR) ibatch.end
{-| Invoke an itoken to guarantee that it exists.
-}
invokeIToken : TokenValue -> Timeline -> ( ITokenPTR, Timeline )
invokeIToken value (Timeline tl) =
( ITokenPTR value
, Timeline
{ tl
| tokens =
case Hashdict.get value tl.tokens of
Just _ ->
tl.tokens
Nothing ->
Hashdict.insert
{ name = value
, starts = Set.empty
, ends = Set.empty
, inFrontOf = Set.empty
, behind = Set.empty
}
tl.tokens
}
)
{-| Under a given filter, find the most recent events.
-}
mostRecentEvents : Filter -> Timeline -> List (List String)
mostRecentEvents filter (Timeline timeline) =
mostRecentFrom filter (Timeline timeline) timeline.mostRecentBatch
{-| Instead of finding the most recent events from the latest sync, users can
also find the most recent events given a token value.
-}
mostRecentEventsFrom : Filter -> ITokenPTRValue -> Timeline -> List (List String)
mostRecentEventsFrom filter tokenName timeline =
mostRecentFrom filter timeline (ITokenPTR tokenName)
{-| Under a given filter, starting from a given ITokenPTR, find the most recent
events.
-}
mostRecentFrom : Filter -> Timeline -> ITokenPTR -> List (List String)
mostRecentFrom filter timeline ptr =
Recursion.runRecursion
(\p ->
case getITokenFromPTR p.ptr timeline of
Nothing ->
Recursion.base []
Just token ->
if Set.member token.name p.visited then
Recursion.base []
else
token.ends
|> Set.toList
|> List.filterMap (\bptrv -> getIBatch (IBatchPTR bptrv) timeline)
|> List.filter (\ibatch -> Filter.subsetOf ibatch.filter filter)
|> Recursion.Traverse.traverseList
(\ibatch ->
Recursion.recurseThen
{ ptr = ibatch.start, visited = Set.insert token.name p.visited }
(\optionalTimelines ->
case optionalTimelines of
[] ->
List.singleton ibatch.events
|> Recursion.base
_ :: _ ->
optionalTimelines
|> List.map
(\outTimeline ->
List.append outTimeline ibatch.events
)
|> Recursion.base
)
)
|> Recursion.map List.concat
)
{ ptr = ptr, visited = Set.empty }
{-| Recount the Timeline's number of filled batches. Since the Timeline
automatically tracks this count itself, this is generally only needed in
specific scenarios like decoding JSON values.
-}
recountFilledBatches : Timeline -> Timeline
recountFilledBatches (Timeline tl) =
Timeline
{ tl
| filledBatches =
tl.batches
|> Iddict.values
|> List.filter (\v -> v.events /= [])
|> List.length
}
{-| Create a timeline with a single batch inserted. This batch is considered the
most recent batch, as if created by a sync.
-}
singleton : Batch -> Timeline
singleton b =
insert b empty
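
A minimal sketch of a single sync being inserted and read back, not part of the diff; the token names and event IDs are made up:

example : List (List String)
example =
    empty
        |> addSync
            { events = [ "$e1", "$e2" ]
            , filter = Filter.pass
            , start = Just "token_1"
            , end = "token_2"
            }
        |> mostRecentEvents Filter.pass
    -- Expected value: [ [ "$e1", "$e2" ] ]
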


@ -0,0 +1,106 @@
module Internal.Values.User exposing
( User, toString, fromString
, localpart, domain
, coder
)
{-| The Matrix user is uniquely identified by their identifier. This User type
helps identify and safely handle these strings to transform them into meaningful
data types.
## User
@docs User, toString, fromString
## Divide
Matrix users are identified by their unique ID. In the Matrix API, this is a
string that looks as follows:
    @alice:example.org
     \---/ \---------/
       |        |
       |        |
   localpart  domain
Since the username is safely parsed, one can get these parts of the username.
@docs localpart, domain
## JSON
@docs coder
-}
import Internal.Config.Log as Log exposing (log)
import Internal.Grammar.ServerName as ServerName
import Internal.Grammar.UserId as UserId
import Internal.Tools.Json as Json
import Parser as P
{-| The Matrix user represents a user across multiple Matrix rooms.
-}
type alias User =
UserId.UserID
{-| Define a method to encode/decode Matrix users.
-}
coder : Json.Coder User
coder =
Json.parser
{ name = "Username"
, p =
P.andThen
(\name ->
P.succeed
( name
, if UserId.isHistorical name then
[ log.warn "Historical user found"
]
else
[]
)
)
UserId.userIdParser
, toString = UserId.toString
}
{-| The domain represents the Matrix homeserver controlling this user. It also
offers other Matrix homeservers an indication of where to look if you wish to
send a message to this user.
-}
domain : User -> String
domain =
.domain >> ServerName.toString
{-| Parse a string and convert it into a User, if formatted properly.
-}
fromString : String -> Maybe User
fromString =
UserId.fromString
{-| The localpart is similar to a username, in the sense that every user has
their own localpart. The localpart is not unique across multiple servers,
however! There can be a user @alice:example.com and a user @alice:example.org in
a room at the same time.
-}
localpart : User -> String
localpart =
.localpart
{-| Convert a user into its unique identifier string value.
-}
toString : User -> String
toString =
UserId.toString
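
A short usage sketch, not part of the diff; `@alice:example.org` is an arbitrary, well-formed Matrix ID:

alice : Maybe User
alice =
    fromString "@alice:example.org"

aliceDomain : Maybe String
aliceDomain =
    -- Just "example.org"
    Maybe.map domain alice
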


@ -122,9 +122,10 @@ roomId (Event event) =
{-| Determine the fully-qualified ID of the user who sent an event.
-}
sender : Event -> String
sender : Event -> Types.User
sender (Event event) =
Envelope.extract .sender event
Envelope.map .sender event
|> Types.User
{-| Determine an event's state key.


@ -68,4 +68,4 @@ getSyncTime (Vault vault) =
-}
setSyncTime : Int -> Vault -> Vault
setSyncTime time (Vault vault) =
Vault <| Envelope.mapSettings (\s -> { s | syncTime = time }) vault
Vault <| Envelope.mapSettings (\s -> { s | syncTime = max 1 time }) vault

src/Matrix/User.elm: new file, 147 lines.

@ -0,0 +1,147 @@
module Matrix.User exposing
( User, toString
, localpart, domain
, get
)
{-| Matrix users are identified by their unique ID. In the Matrix API, this is a
string that looks as follows:
    @alice:example.org
     \---/ \---------/
       |        |
       |        |
   localpart  domain
Since it is very easy to abuse Matrix user IDs to sneak in arbitrary values,
the Elm SDK parses them and makes sure they are safe. As a result, you might
need this module to get the right information from a user!
## User
@docs User, toString
## Info
Sometimes, you are more interested in the username itself. These functions can
help you decipher, disambiguate and categorize users based on their username.
@docs localpart, domain
## Manipulate
@docs get
-}
import Internal.Values.Envelope as Envelope
import Internal.Values.User as Internal
import Types exposing (User(..))
{-| The User type represents a Matrix user.
It contains information like:
- Their username on Matrix
- The server that hosts their account
- Access tokens needed to talk to the server
It does **NOT** contain information like:
- Their nickname
- Their profile picture
- Your private room with them
You can get all that information by looking it up in the [Vault](Matrix#Vault).
**Note:** Please do not store this user type as a variable in your model! You
should always maintain a single source of truth in Elm, and the User type
contains various credentials and API tokens that might expire if you don't
update them from the Vault.
If you need to remember specific users, you can best compare their identifying
string using [toString](Matrix-User#toString) or you can use
[get](Matrix-User#get) with the Vault to get the user type.
-}
type alias User =
Types.User
{-| The domain is the name of the server that the user connects to. Server names
are case-sensitive, so if the strings are equal, the users are on the same
server!
As a result, you can use the user domain for:
- Telling apart users in a room who have the same localpart on different servers
- Finding other users from a potentially malicious homeserver
- Counting homeservers in a room
See the following examples:
domain (get vault "@alice:example.org") -- "example.org"
domain (get vault "@bob:127.0.0.1") -- "127.0.0.1"
domain (get vault "@charlie:[2001:db8::]") -- "[2001:db8::]"
-}
domain : User -> String
domain (User user) =
Envelope.extract Internal.domain user
{-| Get a specific user by their unique identifier.
The Vault is needed as an input because the `User` type also stores various
credentials needed to talk to the Matrix API.
get vault "@alice:example.org" -- Just (User "alice" "example.org")
get vault "@bob:127.0.0.1" -- Just (User "bob" "127.0.0.1")
get vault "@charlie:[2001:db8::]" -- Just (User "charlie" "2001:db8::")
get vault "@evil:#mp#ss#bl#.c#m" -- Nothing
get vault "" -- Nothing
-}
get : Types.Vault -> String -> Maybe User
get (Types.Vault vault) username =
Envelope.mapMaybe (\_ -> Internal.fromString username) vault
|> Maybe.map Types.User
{-| The localpart is the user's unique username. Every homeserver has their own
username registry, so you might occasionally find distinct users with the same
localpart.
The localpart is often used as a user's name in a room if they haven't set up
a custom name.
See the following examples:
localpart (get vault "@alice:example.org") -- "alice"
localpart (get vault "@bob:127.0.0.1") -- "bob"
localpart (get vault "@charlie:[2001:db8::]") -- "charlie"
-}
localpart : User -> String
localpart (User user) =
Envelope.extract Internal.localpart user
{-| Get the uniquely identifying string for this user. Since the strings are
case-sensitive, you can run a simple string comparison to compare usernames.
-}
toString : User -> String
toString (User user) =
Envelope.extract Internal.toString user


@ -1,4 +1,4 @@
module Types exposing (Vault(..), Event(..))
module Types exposing (Vault(..), Event(..), User(..))
{-| The Elm SDK uses a lot of records and values that are easy to manipulate.
Yet, the [Elm design guidelines](https://package.elm-lang.org/help/design-guidelines#keep-tags-and-record-constructors-secret)
@ -12,12 +12,13 @@ access their content directly.
The opaque types are placed in a central module so all exposed modules can
safely access all exposed data types without risking to create circular imports.
@docs Vault, Event
@docs Vault, Event, User
-}
import Internal.Values.Envelope as Envelope
import Internal.Values.Event as Event
import Internal.Values.User as User
import Internal.Values.Vault as Vault
@ -27,6 +28,12 @@ type Event
= Event (Envelope.Envelope Event.Event)
{-| Opaque type for Matrix User
-}
type User
= User (Envelope.Envelope User.User)
{-| Opaque type for Matrix Vault
-}
type Vault


@ -0,0 +1,435 @@
module Test.Filter.Timeline exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Filter.Timeline as Filter exposing (Filter)
import Internal.Grammar.UserId as U
import Internal.Values.Event as Event
import Json.Decode as D
import Json.Encode as E
import Set
import Test exposing (..)
import Test.Values.Event as TestEvent
fuzzer : Fuzzer Filter
fuzzer =
Fuzz.map2 Filter.and
(Fuzz.oneOf
[ Fuzz.map Filter.allSendersExcept (Fuzz.list Fuzz.string)
, Fuzz.map Filter.onlySenders (Fuzz.list Fuzz.string)
, Fuzz.constant Filter.pass
]
)
(Fuzz.oneOf
[ Fuzz.map Filter.allTypesExcept (Fuzz.list Fuzz.string)
, Fuzz.map Filter.onlyTypes (Fuzz.list Fuzz.string)
, Fuzz.constant Filter.pass
]
)
suite : Test
suite =
describe "Timeline filter"
[ describe "Tautological equivalences"
[ test "Pass /= fail"
(Filter.pass
|> Expect.notEqual Filter.fail
|> always
)
, test "All senders == pass"
(Filter.allSendersExcept []
|> Expect.equal Filter.pass
|> always
)
, test "All types == pass"
(Filter.allTypesExcept []
|> Expect.equal Filter.pass
|> always
)
, test "No senders == fail"
(Filter.onlySenders []
|> Expect.equal Filter.fail
|> always
)
, test "No types == fail"
(Filter.onlyTypes []
|> Expect.equal Filter.fail
|> always
)
, fuzz2 Fuzz.string
(Fuzz.list Fuzz.string)
"Some types /= some senders"
(\head tail ->
Expect.notEqual
(Filter.onlyTypes (head :: tail))
(Filter.onlySenders (head :: tail))
)
, fuzz2 fuzzer
fuzzer
"Filter.and f1 f2 == pass iff f1 == f2 == pass"
(\filter1 filter2 ->
Expect.equal
(Filter.and filter1 filter2 == Filter.pass)
(filter1 == Filter.pass && filter2 == Filter.pass)
)
]
, describe "Event filters"
[ fuzz TestEvent.fuzzer
"Only event type filter matches"
(\event ->
event
|> Filter.match (Filter.onlyTypes [ event.eventType ])
|> Expect.equal True
)
, fuzz TestEvent.fuzzer
"Only event sender filter matches"
(\event ->
event
|> Filter.match (Filter.onlySenders [ U.toString event.sender ])
|> Expect.equal True
)
, fuzz TestEvent.fuzzer
"Not event type filter doesn't match"
(\event ->
event
|> Filter.match (Filter.allTypesExcept [ event.eventType ])
|> Expect.equal False
)
, fuzz TestEvent.fuzzer
"Not event sender filter doesn't match"
(\event ->
event
|> Filter.match (Filter.allSendersExcept [ U.toString event.sender ])
|> Expect.equal False
)
, fuzz2 TestEvent.fuzzer
(Fuzz.list Fuzz.string)
"Only matches when in sender list"
(\event senders ->
event
|> Filter.match (Filter.onlySenders senders)
|> Expect.equal (List.member (U.toString event.sender) senders)
)
, fuzz2 TestEvent.fuzzer
(Fuzz.list Fuzz.string)
"Only matches when in type list"
(\event types ->
event
|> Filter.match (Filter.onlyTypes types)
|> Expect.equal (List.member event.eventType types)
)
, fuzz2 TestEvent.fuzzer
(Fuzz.list Fuzz.string)
"All except doesn't match when in sender list"
(\event senders ->
event
|> Filter.match (Filter.allSendersExcept senders)
|> Expect.notEqual (List.member (U.toString event.sender) senders)
)
, fuzz2 TestEvent.fuzzer
(Fuzz.list Fuzz.string)
"All except doesn't match when in type list"
(\event types ->
event
|> Filter.match (Filter.allTypesExcept types)
|> Expect.notEqual (List.member event.eventType types)
)
, fuzz (Fuzz.list Fuzz.string)
"Only list AND all except list = fail senders"
(\senders ->
Filter.onlySenders senders
|> Filter.and (Filter.allSendersExcept senders)
|> Expect.equal Filter.fail
)
, fuzz (Fuzz.list Fuzz.string)
"Only list AND all except list = fail types"
(\types ->
Filter.onlyTypes types
|> Filter.and (Filter.allTypesExcept types)
|> Expect.equal Filter.fail
)
, fuzz2 (Fuzz.list Fuzz.string)
(Fuzz.list Fuzz.string)
"Only list + all except list = common types"
(\t1 t2 ->
Expect.equal
(Filter.and
(Filter.onlyTypes t1)
(Filter.allTypesExcept t2)
)
(Set.diff (Set.fromList t1) (Set.fromList t2)
|> Set.toList
|> Filter.onlyTypes
)
)
, fuzz2 (Fuzz.list Fuzz.string)
(Fuzz.list Fuzz.string)
"Only list + all except list = common senders"
(\t1 t2 ->
Expect.equal
(Filter.and
(Filter.onlySenders t1)
(Filter.allSendersExcept t2)
)
(Set.diff (Set.fromList t1) (Set.fromList t2)
|> Set.toList
|> Filter.onlySenders
)
)
]
, describe "Subset testing"
[ fuzz2 fuzzer
fuzzer
"Combining two filters is always a subset"
(\filter1 filter2 ->
filter1
|> Filter.and filter2
|> Expect.all
[ Filter.subsetOf filter1 >> Expect.equal True
, Filter.subsetOf filter2 >> Expect.equal True
]
)
, fuzz
(Fuzz.bool
|> Fuzz.andThen
(\same ->
if same then
Fuzz.map (\a -> ( a, a )) fuzzer
else
Fuzz.map2 Tuple.pair fuzzer fuzzer
)
)
"subset goes both way iff equal"
(\( filter1, filter2 ) ->
Expect.equal
(filter1 == filter2)
(Filter.subsetOf filter1 filter2
&& Filter.subsetOf filter2 filter1
)
)
, fuzz2 Fuzz.string
(Fuzz.list Fuzz.string)
"One more excluded sender is a subset"
(\head tail ->
Filter.allSendersExcept (head :: tail)
|> Filter.subsetOf (Filter.allSendersExcept tail)
|> Expect.equal True
)
, fuzz2 Fuzz.string
(Fuzz.list Fuzz.string)
"One more excluded type is a subset"
(\head tail ->
Filter.allTypesExcept (head :: tail)
|> Filter.subsetOf (Filter.allTypesExcept tail)
|> Expect.equal True
)
, fuzz2 Fuzz.string
(Fuzz.list Fuzz.string)
"One less included sender is a subset"
(\head tail ->
Filter.onlySenders tail
|> Filter.subsetOf (Filter.onlySenders (head :: tail))
|> Expect.equal True
)
, fuzz2 Fuzz.string
(Fuzz.list Fuzz.string)
"One less included type is a subset"
(\head tail ->
Filter.onlyTypes tail
|> Filter.subsetOf (Filter.onlyTypes (head :: tail))
|> Expect.equal True
)
, fuzz3 Fuzz.string
(Fuzz.list Fuzz.string)
fuzzer
"One more excluded sender is a subset - even when combined with another fuzzer"
(\head tail filter ->
Filter.allSendersExcept (head :: tail)
|> Filter.and filter
|> Filter.subsetOf (Filter.and filter <| Filter.allSendersExcept tail)
|> Expect.equal True
)
, fuzz3 Fuzz.string
(Fuzz.list Fuzz.string)
fuzzer
"One more excluded type is a subset - even when combined with another fuzzer"
(\head tail filter ->
Filter.allTypesExcept (head :: tail)
|> Filter.and filter
|> Filter.subsetOf (Filter.and filter <| Filter.allTypesExcept tail)
|> Expect.equal True
)
, fuzz3 Fuzz.string
(Fuzz.list Fuzz.string)
fuzzer
"One less included sender is a subset - even when combined with another fuzzer"
(\head tail filter ->
Filter.onlySenders tail
|> Filter.and filter
|> Filter.subsetOf (Filter.and filter <| Filter.onlySenders (head :: tail))
|> Expect.equal True
)
, fuzz3 Fuzz.string
(Fuzz.list Fuzz.string)
fuzzer
"One less included type is a subset - even when combined with another fuzzer"
(\head tail filter ->
Filter.onlyTypes tail
|> Filter.and filter
|> Filter.subsetOf (Filter.and filter <| Filter.onlyTypes (head :: tail))
|> Expect.equal True
)
]
, describe "Use case testing"
[ fuzz3 (Fuzz.list TestEvent.fuzzer)
(Fuzz.list Fuzz.string)
(Fuzz.list Fuzz.string)
"Only senders + only type"
(\events senders types ->
let
l1 : List Event.Event
l1 =
events
|> Filter.run
(Filter.and
(Filter.onlySenders senders)
(Filter.onlyTypes types)
)
l2 : List Event.Event
l2 =
List.filter
(\e ->
List.member (U.toString e.sender) senders
&& List.member e.eventType types
)
events
in
Expect.all
[ Expect.equal (List.length l1) (List.length l2)
|> always
, List.map2 Event.isEqual l1 l2
|> List.all identity
|> Expect.equal True
|> always
]
()
)
, fuzz3 (Fuzz.list TestEvent.fuzzer)
(Fuzz.list Fuzz.string)
(Fuzz.list Fuzz.string)
"Only senders + all except type"
(\events senders types ->
let
l1 : List Event.Event
l1 =
events
|> Filter.run
(Filter.and
(Filter.onlySenders senders)
(Filter.allTypesExcept types)
)
l2 : List Event.Event
l2 =
List.filter
(\e ->
List.member (U.toString e.sender) senders
&& (not <| List.member (U.toString e.sender) types)
)
events
in
Expect.all
[ Expect.equal (List.length l1) (List.length l2)
|> always
, List.map2 Event.isEqual l1 l2
|> List.all identity
|> Expect.equal True
|> always
]
()
)
, fuzz3 (Fuzz.list TestEvent.fuzzer)
(Fuzz.list Fuzz.string)
(Fuzz.list Fuzz.string)
"All except senders + only type"
(\events senders types ->
let
l1 : List Event.Event
l1 =
events
|> Filter.run
(Filter.and
(Filter.allSendersExcept senders)
(Filter.onlyTypes types)
)
l2 : List Event.Event
l2 =
List.filter
(\e ->
(not <| List.member (U.toString e.sender) senders)
&& List.member e.eventType types
)
events
in
Expect.all
[ Expect.equal (List.length l1) (List.length l2)
|> always
, List.map2 Event.isEqual l1 l2
|> List.all identity
|> Expect.equal True
|> always
]
()
)
, fuzz3 (Fuzz.list TestEvent.fuzzer)
(Fuzz.list Fuzz.string)
(Fuzz.list Fuzz.string)
"All except senders + all except type"
(\events senders types ->
let
l1 : List Event.Event
l1 =
events
|> Filter.run
(Filter.and
(Filter.allSendersExcept senders)
(Filter.allTypesExcept types)
)
l2 : List Event.Event
l2 =
List.filter
(\e ->
(not <| List.member (U.toString e.sender) senders)
&& (not <| List.member e.eventType types)
)
events
in
Expect.all
[ Expect.equal (List.length l1) (List.length l2)
|> always
, List.map2 Event.isEqual l1 l2
|> List.all identity
|> Expect.equal True
|> always
]
()
)
]
, describe "JSON"
[ fuzz fuzzer
"encode -> decode is the same"
(\filter ->
filter
|> Filter.encode
|> E.encode 0
|> D.decodeString Filter.decoder
|> Expect.equal (Ok ( filter, [] ))
)
]
]


@ -0,0 +1,126 @@
module Test.Grammar.ServerName exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Grammar.ServerName as SN
import Test exposing (..)
dnsFuzzer : Fuzzer String
dnsFuzzer =
Fuzz.map2
(\head tail ->
String.fromList (head :: tail)
)
("abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
|> String.toList
|> Fuzz.oneOfValues
)
("abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789-."
|> String.toList
|> Fuzz.oneOfValues
|> Fuzz.listOfLengthBetween 0 (255 - 1)
)
hostnameFuzzer : Fuzzer String
hostnameFuzzer =
Fuzz.oneOf
[ dnsFuzzer
, ipv4Fuzzer
, Fuzz.map (\x -> "[" ++ x ++ "]") ipv6Fuzzer
]
ipv4Fuzzer : Fuzzer String
ipv4Fuzzer =
Fuzz.intRange 0 255
|> Fuzz.listOfLength 4
|> Fuzz.map
(List.map String.fromInt
>> List.intersperse "."
>> String.concat
)
ipv6Fuzzer : Fuzzer String
ipv6Fuzzer =
let
num : Fuzzer String
num =
"0123456789abcdefABCDEF"
|> String.toList
|> Fuzz.oneOfValues
|> Fuzz.listOfLength 4
|> Fuzz.map String.fromList
in
Fuzz.oneOf
[ Fuzz.listOfLength 8 num
|> Fuzz.map (List.intersperse ":")
|> Fuzz.map String.concat
, Fuzz.listOfLengthBetween 0 7 num
|> Fuzz.andThen
(\front ->
num
|> Fuzz.listOfLengthBetween 0 (8 - 1 - List.length front)
|> Fuzz.map
(\back ->
[ front
|> List.intersperse ":"
, [ "::" ]
, back
|> List.intersperse ":"
]
|> List.concat
|> String.concat
)
)
]
portFuzzer : Fuzzer String
portFuzzer =
Fuzz.oneOf
[ Fuzz.constant ""
, Fuzz.intRange 0 65535
|> Fuzz.map (\p -> ":" ++ String.fromInt p)
]
serverNameFuzzer : Fuzzer String
serverNameFuzzer =
Fuzz.map2 (++) hostnameFuzzer portFuzzer
suite : Test
suite =
describe "Server name tests"
[ describe "Checking correct values"
[ fuzz serverNameFuzzer
"Correct server names validate"
(\server ->
SN.fromString server
|> Maybe.map SN.toString
|> Expect.equal (Just server)
)
, test "Checking spec examples"
(\() ->
let
examples : List String
examples =
[ "matrix.org"
, "matrix.org:8888"
, "1.2.3.4"
, "1.2.3.4:1234"
, "[1234:5678::abcd]"
, "[1234:5678::abcd]:5678"
]
in
examples
|> List.map SN.fromString
|> List.map ((/=) Nothing)
|> Expect.equalLists
(List.repeat (List.length examples) True)
)
]
]


@ -0,0 +1,159 @@
module Test.Grammar.UserId exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Grammar.ServerName as SN
import Internal.Grammar.UserId as U
import Test exposing (..)
import Test.Grammar.ServerName as ServerName
modernUserCharFuzzer : Fuzzer Char
modernUserCharFuzzer =
Fuzz.oneOf
[ Fuzz.intRange 0x61 0x7A
|> Fuzz.map Char.fromCode
, "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"
|> String.toList
|> Fuzz.oneOfValues
]
historicalUserCharFuzzer : Fuzzer Char
historicalUserCharFuzzer =
[ ( 0x21, 0x39 ), ( 0x3B, 0x7E ) ]
|> List.map (\( low, high ) -> Fuzz.intRange low high)
|> Fuzz.oneOf
|> Fuzz.map Char.fromCode
modernUserFuzzer : Fuzzer String
modernUserFuzzer =
Fuzz.map2
(\localpart domain ->
let
maxLocalSize : Int
maxLocalSize =
255 - String.length domain - 2
in
localpart
|> List.take maxLocalSize
|> String.fromList
|> (\l -> "@" ++ l ++ ":" ++ domain)
)
(Fuzz.listOfLengthBetween 1 255 modernUserCharFuzzer)
(ServerName.serverNameFuzzer
|> Fuzz.filter
(\name ->
String.length name < 255 - 2
)
)
historicalUserFuzzer : Fuzzer String
historicalUserFuzzer =
Fuzz.map2
(\localpart domain ->
let
maxLocalSize : Int
maxLocalSize =
255 - String.length domain - 2
in
localpart
|> List.take maxLocalSize
|> String.fromList
|> (\l -> "@" ++ l ++ ":" ++ domain)
)
(Fuzz.listOfLengthBetween 1 255 historicalUserCharFuzzer)
(ServerName.serverNameFuzzer
|> Fuzz.filter
(\name ->
String.length name < 255 - 2
)
)
userFuzzer : Fuzzer String
userFuzzer =
Fuzz.oneOf [ modernUserFuzzer, historicalUserFuzzer ]
fullUserFuzzer : Fuzzer U.UserID
fullUserFuzzer =
userFuzzer
|> Fuzz.map U.fromString
|> Fuzz.map (Maybe.withDefault { localpart = "a", domain = { host = SN.DNS "a", port_ = Nothing } })
suite : Test
suite =
describe "UserId"
[ describe "Size"
[ fuzz ServerName.serverNameFuzzer
"Username cannot be length 0"
(\domain ->
"@"
++ ":"
++ domain
|> U.fromString
|> Expect.equal Nothing
)
, fuzz2 (Fuzz.listOfLengthBetween 1 255 historicalUserCharFuzzer)
ServerName.serverNameFuzzer
"Username length cannot exceed 255"
(\localpart domain ->
let
username : String
username =
"@"
++ String.fromList localpart
++ ":"
++ domain
in
Expect.equal
(U.fromString username == Nothing)
(String.length username > 255)
)
, fuzz modernUserFuzzer
"Modern fuzzer has appropriate size"
(String.length >> Expect.lessThan 256)
, fuzz historicalUserFuzzer
"Historical fuzzer has appropriate size"
(String.length >> Expect.lessThan 256)
, fuzz userFuzzer
"User fuzzers have appropriate size"
(String.length >> Expect.lessThan 256)
]
, describe "From string evaluation"
[ fuzz userFuzzer
"fromString always returns a value on fuzzer"
(U.fromString >> Expect.notEqual Nothing)
, fuzz userFuzzer
"fromString -> toString returns the same value"
(\username ->
username
|> U.fromString
|> Maybe.map U.toString
|> Expect.equal (Just username)
)
-- Not always True: historicalUserCharFuzzer may generate only characters
-- that are also valid in modern user IDs, so the resulting user need not
-- be historical.
-- TODO: Define a fitting fuzzer for this test
-- , fuzz historicalUserFuzzer
-- "Historical users are historical"
-- (\username ->
-- username
-- |> U.fromString
-- |> Maybe.map U.isHistorical
-- |> Expect.equal (Just True)
-- )
, fuzz modernUserFuzzer
"Modern users are not historical"
(\username ->
username
|> U.fromString
|> Maybe.map U.isHistorical
|> Expect.equal (Just False)
)
]
]
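
Since fullUserFuzzer above produces already-parsed UserID records, the inverse round trip can be sketched as well. This test is illustrative only and not part of the diff; it assumes U.toString renders the record back to a string that U.fromString re-parses to the same record:

toStringRoundTrip : Test
toStringRoundTrip =
    describe "Sketch: toString round trip"
        [ fuzz fullUserFuzzer
            "toString -> fromString returns the same record"
            (\user ->
                user
                    |> U.toString
                    |> U.fromString
                    |> Expect.equal (Just user)
            )
        ]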


@ -27,7 +27,7 @@ settings =
vault
|> Matrix.Settings.setSyncTime sync
|> Matrix.Settings.getSyncTime
|> Expect.equal sync
|> Expect.equal (max 1 sync)
)
]


@ -3,6 +3,7 @@ module Test.Tools.Hashdict exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Tools.Hashdict as Hashdict exposing (Hashdict)
import Internal.Tools.Json as Json
import Internal.Values.Event as Event
import Json.Decode as D
import Json.Encode as E
@ -93,11 +94,11 @@ suite =
"JSON encode -> JSON decode"
(\indent ->
Hashdict.empty identity
|> Hashdict.encode E.string
|> Json.encode (Hashdict.coder identity Json.string)
|> E.encode indent
|> D.decodeString (Hashdict.decoder identity D.string)
|> Result.map (Hashdict.isEqual (Hashdict.empty String.toUpper))
|> Expect.equal (Ok True)
|> D.decodeString (Json.decode <| Hashdict.coder identity Json.string)
|> Result.map (Tuple.mapFirst (Hashdict.isEqual (Hashdict.empty String.toUpper)))
|> Expect.equal (Ok ( True, [] ))
)
]
, describe "singleton"
@ -114,7 +115,7 @@ suite =
(\event ->
Hashdict.singleton .eventId event
|> Hashdict.remove event
|> Hashdict.isEqual (Hashdict.empty .sender)
|> Hashdict.isEqual (Hashdict.empty .roomId)
|> Expect.equal True
)
, fuzz TestEvent.fuzzer
@ -122,7 +123,7 @@ suite =
(\event ->
Hashdict.singleton .eventId event
|> Hashdict.removeKey event.eventId
|> Hashdict.isEqual (Hashdict.empty .sender)
|> Hashdict.isEqual (Hashdict.empty .roomId)
|> Expect.equal True
)
, fuzz TestEvent.fuzzer
@ -164,11 +165,11 @@ suite =
"JSON encode -> JSON decode"
(\hashdict indent ->
hashdict
|> Hashdict.encode Event.encode
|> Json.encode (Hashdict.coder .eventId Event.coder)
|> E.encode indent
|> D.decodeString (Hashdict.decoder .eventId Event.decoder)
|> Result.map Hashdict.toList
|> Expect.equal (Ok <| Hashdict.toList hashdict)
|> D.decodeString (Json.decode <| Hashdict.coder .eventId Event.coder)
|> Result.map (Tuple.first >> Hashdict.toList)
|> Expect.equal (Ok (Hashdict.toList hashdict))
)
]
]
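
The updated tests above all follow the same coder round trip: encode with a Json.Coder, render to a string, decode again, and drop the decoder's log output with Tuple.first before comparing. A generic helper sketch; roundTrip is hypothetical, and it assumes Json.encode yields a Json.Encode.Value and Json.decode yields a decoder for a (value, log) pair, as the tests suggest:

roundTrip : Json.Coder a -> a -> Result D.Error a
roundTrip coder value =
    value
        |> Json.encode coder
        |> E.encode 0
        |> D.decodeString (Json.decode coder)
        |> Result.map Tuple.first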


@ -3,6 +3,7 @@ module Test.Tools.Iddict exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Tools.Iddict as Iddict exposing (Iddict)
import Internal.Tools.Json as Json
import Json.Decode as D
import Json.Encode as E
import Test exposing (..)
@ -73,21 +74,23 @@ empty =
)
, test "JSON encode -> decode -> empty"
(Iddict.empty
|> Iddict.encode identity
|> D.decodeValue (Iddict.decoder D.value)
|> Iddict.encode Json.value
|> D.decodeValue (Iddict.decoder Json.value)
|> Result.map Tuple.first
|> Expect.equal (Ok Iddict.empty)
|> always
)
, test "JSON encode"
(Iddict.empty
|> Iddict.encode identity
|> Iddict.encode Json.value
|> E.encode 0
|> Expect.equal "{\"cursor\":0,\"dict\":{}}"
|> Expect.equal "{\"dict\":{}}"
|> always
)
, test "JSON decode"
("{\"cursor\":0,\"dict\":{}}"
|> D.decodeString (Iddict.decoder D.value)
("{\"dict\":{}}"
|> D.decodeString (Iddict.decoder Json.value)
|> Result.map Tuple.first
|> Expect.equal (Ok Iddict.empty)
|> always
)
@ -170,8 +173,9 @@ singleton =
"JSON encode -> decode -> singleton"
(\single ->
single
|> Iddict.encode E.int
|> D.decodeValue (Iddict.decoder D.int)
|> Iddict.encode Json.int
|> D.decodeValue (Iddict.decoder Json.int)
|> Result.map Tuple.first
|> Expect.equal (Ok single)
)
, fuzz Fuzz.int
@ -179,7 +183,7 @@ singleton =
(\i ->
Iddict.singleton i
|> Tuple.second
|> Iddict.encode E.int
|> Iddict.encode Json.int
|> E.encode 0
|> Expect.equal ("{\"cursor\":1,\"dict\":{\"0\":" ++ String.fromInt i ++ "}}")
)
@ -187,7 +191,8 @@ singleton =
"JSON decode"
(\i ->
("{\"cursor\":1,\"dict\":{\"0\":" ++ String.fromInt i ++ "}}")
|> D.decodeString (Iddict.decoder D.int)
|> D.decodeString (Iddict.decoder Json.int)
|> Result.map Tuple.first
|> Tuple.pair 0
|> Expect.equal (Iddict.singleton i |> Tuple.mapSecond Ok)
)

tests/Test/Tools/Json.elm (new file, 508 lines)

@ -0,0 +1,508 @@
module Test.Tools.Json exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Tools.Json as Json
import Json.Decode as D
import Json.Encode as E
import Test exposing (..)
type alias Human2 =
{ name : String, age : Maybe Int }
type alias Human3 =
{ name : String, age : Maybe Int, hobbies : List String }
type alias Human4 =
{ name : String
, age : Maybe Int
, hobbies : List String
, weight : Maybe Float
}
type alias Human5 =
{ name : String
, age : Maybe Int
, hobbies : List String
, weight : Maybe Float
, height : Float
}
type alias Human6 =
{ name : String
, age : Maybe Int
, hobbies : List String
, weight : Maybe Float
, height : Float
, invitedToParty : Bool
}
type alias Human7 =
{ name : String
, age : Maybe Int
, hobbies : List String
, weight : Maybe Float
, height : Float
, invitedToParty : Bool
, presentGiven : Maybe String
}
type alias Human8 =
{ name : String
, age : Maybe Int
, hobbies : List String
, weight : Maybe Float
, height : Float
, invitedToParty : Bool
, presentGiven : Maybe String
, grid : List (List Int)
}
type alias MegaHuman =
{ human2 : Human2
, human3 : Human3
, human4 : Human4
, human5 : Human5
, human6 : Human6
, human7 : Human7
, human8 : Human8
}
ageField : Json.Field (Maybe Int) { a | age : Maybe Int }
ageField =
Json.field.optional.value
{ fieldName = "age"
, toField = .age
, description = []
, coder = Json.int
}
ageFuzzer : Fuzzer (Maybe Int)
ageFuzzer =
Fuzz.maybe Fuzz.int
gridField : Json.Field (List (List Int)) { a | grid : List (List Int) }
gridField =
Json.field.optional.withDefault
{ fieldName = "grid"
, toField = .grid
, description = []
, coder = Json.list (Json.list Json.int)
, default = ( [], [] )
, defaultToString = always "[]"
}
gridFuzzer : Fuzzer (List (List Int))
gridFuzzer =
Fuzz.list (Fuzz.list Fuzz.int)
heightField : Json.Field Float { a | height : Float }
heightField =
Json.field.required
{ fieldName = "height"
, toField = .height
, description = []
, coder = Json.float
}
heightFuzzer : Fuzzer Float
heightFuzzer =
Fuzz.niceFloat
hobbiesField : Json.Field (List String) { a | hobbies : List String }
hobbiesField =
Json.field.optional.withDefault
{ fieldName = "hobbies"
, toField = .hobbies
, description = []
, coder = Json.list Json.string
, default = ( [], [] )
, defaultToString = always "[]"
}
hobbiesFuzzer : Fuzzer (List String)
hobbiesFuzzer =
Fuzz.list Fuzz.string
invitedToPartyField : Json.Field Bool { a | invitedToParty : Bool }
invitedToPartyField =
Json.field.optional.withDefault
{ fieldName = "invitedToParty"
, toField = .invitedToParty
, description = []
, coder = Json.bool
, default = ( False, [] )
, defaultToString =
\b ->
if b then
"True"
else
"False"
}
invitedToPartyFuzzer : Fuzzer Bool
invitedToPartyFuzzer =
Fuzz.bool
nameField : Json.Field String { a | name : String }
nameField =
Json.field.required
{ fieldName = "name"
, toField = .name
, description = []
, coder = Json.string
}
nameFuzzer : Fuzzer String
nameFuzzer =
Fuzz.string
presentGivenField : Json.Field (Maybe String) { a | presentGiven : Maybe String }
presentGivenField =
Json.field.required
{ fieldName = "presentGiven"
, toField = .presentGiven
, description = []
, coder = Json.maybe Json.string
}
presentGivenFuzzer : Fuzzer (Maybe String)
presentGivenFuzzer =
Fuzz.maybe Fuzz.string
weightField : Json.Field (Maybe Float) { a | weight : Maybe Float }
weightField =
Json.field.optional.value
{ fieldName = "weight"
, toField = .weight
, description = []
, coder = Json.float
}
weightFuzzer : Fuzzer (Maybe Float)
weightFuzzer =
-- TODO: Maybe make Float not so nice?
Fuzz.maybe Fuzz.niceFloat
human2Coder : Json.Coder Human2
human2Coder =
Json.object2
{ name = "Human2"
, description = []
, init = Human2
}
nameField
ageField
human2Fuzzer : Fuzzer Human2
human2Fuzzer =
Fuzz.map2 Human2
nameFuzzer
ageFuzzer
human3Coder : Json.Coder Human3
human3Coder =
Json.object3
{ name = "Human3"
, description = []
, init = Human3
}
nameField
ageField
hobbiesField
human3Fuzzer : Fuzzer Human3
human3Fuzzer =
Fuzz.map3 Human3
nameFuzzer
ageFuzzer
hobbiesFuzzer
human4Coder : Json.Coder Human4
human4Coder =
Json.object4
{ name = "Human4"
, description = []
, init = Human4
}
nameField
ageField
hobbiesField
weightField
human4Fuzzer : Fuzzer Human4
human4Fuzzer =
Fuzz.map4 Human4
nameFuzzer
ageFuzzer
hobbiesFuzzer
weightFuzzer
human5Coder : Json.Coder Human5
human5Coder =
Json.object5
{ name = "Human5"
, description = []
, init = Human5
}
nameField
ageField
hobbiesField
weightField
heightField
human5Fuzzer : Fuzzer Human5
human5Fuzzer =
Fuzz.map5 Human5
nameFuzzer
ageFuzzer
hobbiesFuzzer
weightFuzzer
heightFuzzer
human6Coder : Json.Coder Human6
human6Coder =
Json.object6
{ name = "Human6"
, description = []
, init = Human6
}
nameField
ageField
hobbiesField
weightField
heightField
invitedToPartyField
human6Fuzzer : Fuzzer Human6
human6Fuzzer =
Fuzz.map6 Human6
nameFuzzer
ageFuzzer
hobbiesFuzzer
weightFuzzer
heightFuzzer
invitedToPartyFuzzer
human7Coder : Json.Coder Human7
human7Coder =
Json.object7
{ name = "Human7"
, description = []
, init = Human7
}
nameField
ageField
hobbiesField
weightField
heightField
invitedToPartyField
presentGivenField
human7Fuzzer : Fuzzer Human7
human7Fuzzer =
Fuzz.map7 Human7
nameFuzzer
ageFuzzer
hobbiesFuzzer
weightFuzzer
heightFuzzer
invitedToPartyFuzzer
presentGivenFuzzer
human8Coder : Json.Coder Human8
human8Coder =
Json.object8
{ name = "Human8"
, description = []
, init = Human8
}
nameField
ageField
hobbiesField
weightField
heightField
invitedToPartyField
presentGivenField
gridField
human8Fuzzer : Fuzzer Human8
human8Fuzzer =
Fuzz.map8 Human8
nameFuzzer
ageFuzzer
hobbiesFuzzer
weightFuzzer
heightFuzzer
invitedToPartyFuzzer
presentGivenFuzzer
gridFuzzer
megaHumanCoder : Json.Coder MegaHuman
megaHumanCoder =
Json.object7
{ name = "MegaHuman"
, description = []
, init = MegaHuman
}
(Json.field.required { fieldName = "h2", toField = .human2, description = [], coder = human2Coder })
(Json.field.required { fieldName = "h3", toField = .human3, description = [], coder = human3Coder })
(Json.field.required { fieldName = "h4", toField = .human4, description = [], coder = human4Coder })
(Json.field.required { fieldName = "h5", toField = .human5, description = [], coder = human5Coder })
(Json.field.required { fieldName = "h6", toField = .human6, description = [], coder = human6Coder })
(Json.field.required { fieldName = "h7", toField = .human7, description = [], coder = human7Coder })
(Json.field.required { fieldName = "h8", toField = .human8, description = [], coder = human8Coder })
megahumanFuzzer : Fuzzer MegaHuman
megahumanFuzzer =
Fuzz.map7 MegaHuman
human2Fuzzer
human3Fuzzer
human4Fuzzer
human5Fuzzer
human6Fuzzer
human7Fuzzer
human8Fuzzer
suite : Test
suite =
describe "JSON module"
[ describe "Human2"
[ fuzz human2Fuzzer
"Recoding succeeds"
(\human ->
human
|> Json.encode human2Coder
|> E.encode 0
|> D.decodeString (Json.decode human2Coder)
|> Result.map Tuple.first
|> Expect.equal (Ok human)
)
]
, describe "Human3"
[ fuzz human3Fuzzer
"Recoding succeeds"
(\human ->
human
|> Json.encode human3Coder
|> E.encode 0
|> D.decodeString (Json.decode human3Coder)
|> Result.map Tuple.first
|> Expect.equal (Ok human)
)
]
, describe "Human4"
[ fuzz human4Fuzzer
"Recoding succeeds"
(\human ->
human
|> Json.encode human4Coder
|> E.encode 0
|> D.decodeString (Json.decode human4Coder)
|> Result.map Tuple.first
|> Expect.equal (Ok human)
)
]
, describe "Human5"
[ fuzz human5Fuzzer
"Recoding succeeds"
(\human ->
human
|> Json.encode human5Coder
|> E.encode 0
|> D.decodeString (Json.decode human5Coder)
|> Result.map Tuple.first
|> Expect.equal (Ok human)
)
]
, describe "Human6"
[ fuzz human6Fuzzer
"Recoding succeeds"
(\human ->
human
|> Json.encode human6Coder
|> E.encode 0
|> D.decodeString (Json.decode human6Coder)
|> Result.map Tuple.first
|> Expect.equal (Ok human)
)
]
, describe "Human7"
[ fuzz human7Fuzzer
"Recoding succeeds"
(\human ->
human
|> Json.encode human7Coder
|> E.encode 0
|> D.decodeString (Json.decode human7Coder)
|> Result.map Tuple.first
|> Expect.equal (Ok human)
)
]
, describe "Human8"
[ fuzz human8Fuzzer
"Recoding succeeds"
(\human ->
human
|> Json.encode human8Coder
|> E.encode 0
|> D.decodeString (Json.decode human8Coder)
|> Result.map Tuple.first
|> Expect.equal (Ok human)
)
]
, describe "MegaHuman"
[ fuzz megahumanFuzzer
"Recoding succeeds"
(\megahuman ->
megahuman
|> Json.encode megaHumanCoder
|> E.encode 0
|> D.decodeString (Json.decode megaHumanCoder)
|> Result.map Tuple.first
|> Expect.equal (Ok megahuman)
)
]
]
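
A quick illustration of what these coders produce for a concrete value; encodedAlice is hypothetical, and the exact JSON shown is an assumption (optional fields that are Nothing are presumably omitted from the output):

encodedAlice : String
encodedAlice =
    { name = "Alice", age = Nothing }
        |> Json.encode human2Coder
        |> E.encode 0

-- assumed result: {"name":"Alice"}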


@ -2,6 +2,7 @@ module Test.Tools.Mashdict exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Tools.Json as Json
import Internal.Tools.Mashdict as Mashdict exposing (Mashdict)
import Internal.Values.Event as Event
import Json.Decode as D
@ -93,11 +94,11 @@ suite =
"JSON encode -> JSON decode"
(\indent ->
Mashdict.empty Just
|> Mashdict.encode E.string
|> Json.encode (Mashdict.coder Just Json.string)
|> E.encode indent
|> D.decodeString (Mashdict.decoder Just D.string)
|> Result.map (Mashdict.isEqual (Mashdict.empty Just))
|> Expect.equal (Ok True)
|> D.decodeString (Json.decode <| Mashdict.coder Just Json.string)
|> Result.map (Tuple.mapFirst <| Mashdict.isEqual (Mashdict.empty Just))
|> Expect.equal (Ok ( True, [] ))
)
]
, describe "singleton"
@ -194,11 +195,11 @@ suite =
"JSON encode -> JSON decode"
(\hashdict indent ->
hashdict
|> Mashdict.encode Event.encode
|> Json.encode (Mashdict.coder .stateKey Event.coder)
|> E.encode indent
|> D.decodeString (Mashdict.decoder .stateKey Event.decoder)
|> Result.map Mashdict.toList
|> Expect.equal (Ok <| Mashdict.toList hashdict)
|> D.decodeString (Json.decode <| Mashdict.coder .stateKey Event.coder)
|> Result.map (Tuple.first >> Mashdict.toList)
|> Expect.equal (Ok (Mashdict.toList hashdict))
)
]
]


@ -26,7 +26,7 @@ suite =
|> Timestamp.encode
|> E.encode indent
|> D.decodeString Timestamp.decoder
|> Expect.equal (Ok time)
|> Expect.equal (Ok ( time, [] ))
)
, fuzz fuzzer
"JSON decode -> millis"
@ -42,7 +42,7 @@ suite =
n
|> E.int
|> D.decodeValue Timestamp.decoder
|> Expect.equal (Ok <| Time.millisToPosix n)
|> Expect.equal (Ok ( Time.millisToPosix n, [] ))
)
]
, describe "Identity"


@ -138,6 +138,6 @@ json =
context
|> Context.encode
|> D.decodeValue Context.decoder
|> Expect.equal (Ok context)
|> Expect.equal (Ok ( context, [] ))
)
]


@ -3,6 +3,7 @@ module Test.Values.Envelope exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Config.Default as Default
import Internal.Tools.Json as Json
import Internal.Values.Envelope as Envelope exposing (Envelope)
import Json.Decode as D
import Json.Encode as E
@ -56,10 +57,10 @@ suite =
"JSON encode -> JSON decode"
(\envelope indent ->
envelope
|> Envelope.encode E.string
|> Envelope.encode Json.string
|> E.encode indent
|> D.decodeString (Envelope.decoder D.string)
|> Expect.equal (Ok envelope)
|> D.decodeString (Envelope.decoder Json.string)
|> Expect.equal (Ok ( envelope, [] ))
)
]
]


@ -1,9 +1,11 @@
module Test.Values.Event exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Values.Event as Event exposing (Event)
import Json.Encode as E
import Test exposing (..)
import Test.Grammar.UserId as UserId
import Test.Tools.Timestamp as TestTimestamp
@ -14,7 +16,7 @@ fuzzer =
Fuzz.string
TestTimestamp.fuzzer
Fuzz.string
Fuzz.string
UserId.fullUserFuzzer
(Fuzz.maybe Fuzz.string)
Fuzz.string
(Fuzz.maybe unsignedDataFuzzer)
@ -65,3 +67,15 @@ valueFuzzer =
, Fuzz.map (E.list E.string) (Fuzz.list Fuzz.string)
, Fuzz.map Event.encode (Fuzz.lazy (\_ -> fuzzer))
]
suite : Test
suite =
describe "Sanity check"
[ fuzz fuzzer
"event = event"
(\event ->
Event.isEqual event event
|> Expect.equal True
)
]


@ -61,7 +61,7 @@ suite =
, test "JSON decode {} is init"
("{}"
|> D.decodeString Settings.decoder
|> Expect.equal (Ok Settings.init)
|> Expect.equal (Ok ( Settings.init, [] ))
|> always
)
]
@ -74,7 +74,7 @@ suite =
|> Settings.encode
|> E.encode indent
|> D.decodeString Settings.decoder
|> Expect.equal (Ok settings)
|> Expect.equal (Ok ( settings, [] ))
)
]
]


@ -84,7 +84,7 @@ suite =
|> StateManager.encode
|> E.encode 0
|> D.decodeString StateManager.decoder
|> Expect.equal (Ok StateManager.empty)
|> Expect.equal (Ok ( StateManager.empty, [] ))
|> always
)
]


@ -0,0 +1,383 @@
module Test.Values.Timeline exposing (..)
import Expect
import Fuzz exposing (Fuzzer)
import Internal.Filter.Timeline as Filter exposing (Filter)
import Internal.Tools.Json as Json
import Internal.Values.Timeline as Timeline exposing (Batch, Timeline)
import Json.Decode as D
import Test exposing (..)
import Test.Filter.Timeline as TestFilter
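-- Editor's note on the fuzzer below: it folds a list of batches into a
-- timeline, making each batch start at the previous batch's end token, and
-- applies them with either Timeline.insert or Timeline.addSync, using the
-- global filter or a narrowed Filter.and of it.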
fuzzer : Fuzzer Timeline
fuzzer =
TestFilter.fuzzer
|> Fuzz.andThen
(\globalFilter ->
Fuzz.oneOf
[ Fuzz.map2
(\start batches ->
List.foldl
(\b ( s, f ) ->
( b.end
, f >> Timeline.insert { b | start = Just s, filter = globalFilter }
)
)
( start, identity )
batches
|> Tuple.second
)
Fuzz.string
(Fuzz.listOfLengthBetween 0 10 fuzzerBatch)
, Fuzz.map2
(\start batches ->
List.foldl
(\b ( s, f ) ->
( b.end
, f >> Timeline.insert { b | start = Just s, filter = Filter.and globalFilter b.filter }
)
)
( start, identity )
batches
|> Tuple.second
)
Fuzz.string
(Fuzz.listOfLengthBetween 0 4 fuzzerBatch)
, Fuzz.map2
(\start batches ->
List.foldl
(\b ( s, f ) ->
( b.end
, f >> Timeline.addSync { b | start = Just s, filter = globalFilter }
)
)
( start, identity )
batches
|> Tuple.second
)
Fuzz.string
(Fuzz.listOfLengthBetween 0 10 fuzzerBatch)
, Fuzz.map2
(\start batches ->
List.foldl
(\b ( s, f ) ->
( b.end
, f >> Timeline.addSync { b | start = Just s, filter = Filter.and globalFilter b.filter }
)
)
( start, identity )
batches
|> Tuple.second
)
Fuzz.string
(Fuzz.listOfLengthBetween 0 4 fuzzerBatch)
]
|> Fuzz.listOfLengthBetween 0 10
|> Fuzz.map (List.foldl (<|) Timeline.empty)
)
fuzzerBatch : Fuzzer Batch
fuzzerBatch =
Fuzz.map4 Batch
(Fuzz.list Fuzz.string)
TestFilter.fuzzer
(Fuzz.maybe Fuzz.string)
Fuzz.string
suite : Test
suite =
describe "Timeline"
[ describe "most recent events with filters"
[ fuzz TestFilter.fuzzer
"Events are returned properly"
(\filter ->
Timeline.empty
|> Timeline.insert
{ events = [ "a", "b", "c" ]
, filter = filter
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.insert
{ events = [ "d", "e", "f" ]
, filter = filter
, start = Just "token_2"
, end = "token_3"
}
|> Timeline.mostRecentEventsFrom filter "token_3"
|> Expect.equal
[ [ "a", "b", "c", "d", "e", "f" ] ]
)
, fuzz2 TestFilter.fuzzer
TestFilter.fuzzer
"Sub-events get the same results"
(\f1 f2 ->
let
subFilter =
Filter.and f1 f2
in
Timeline.empty
|> Timeline.insert
{ events = [ "a", "b", "c" ]
, filter = f1
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.insert
{ events = [ "d", "e", "f" ]
, filter = f1
, start = Just "token_2"
, end = "token_3"
}
|> Timeline.mostRecentEventsFrom subFilter "token_3"
|> Expect.equal
[ [ "a", "b", "c", "d", "e", "f" ] ]
)
, fuzz2 TestFilter.fuzzer
TestFilter.fuzzer
"ONLY same result if sub-filter"
(\f1 f2 ->
Timeline.empty
|> Timeline.insert
{ events = [ "a", "b", "c" ]
, filter = f1
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.insert
{ events = [ "d", "e", "f" ]
, filter = f1
, start = Just "token_2"
, end = "token_3"
}
|> Timeline.mostRecentEventsFrom f2 "token_3"
|> (\events ->
Expect.equal
(Filter.subsetOf f1 f2)
(events == [ [ "a", "b", "c", "d", "e", "f" ] ])
)
)
]
, describe "Forks in the road"
[ fuzz2 TestFilter.fuzzer
TestFilter.fuzzer
"Two options returned"
(\f1 f2 ->
let
subFilter =
Filter.and f1 f2
in
Timeline.empty
|> Timeline.insert
{ events = [ "a", "b", "c" ]
, filter = f1
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.insert
{ events = [ "d", "e", "f" ]
, filter = f2
, start = Just "token_3"
, end = "token_2"
}
|> Timeline.insert
{ events = [ "g", "h", "i" ]
, filter = subFilter
, start = Just "token_2"
, end = "token_4"
}
|> Timeline.mostRecentEventsFrom subFilter "token_4"
|> Expect.equal
[ [ "a", "b", "c", "g", "h", "i" ]
, [ "d", "e", "f", "g", "h", "i" ]
]
)
]
, describe "Gaps"
[ fuzz TestFilter.fuzzer
"Gaps leave behind old events"
(\filter ->
Timeline.empty
|> Timeline.insert
{ events = [ "a", "b", "c" ]
, filter = filter
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.insert
{ events = [ "d", "e", "f" ]
, filter = filter
, start = Just "token_3"
, end = "token_4"
}
|> Timeline.mostRecentEventsFrom filter "token_4"
|> Expect.equal [ [ "d", "e", "f" ] ]
)
, fuzz3 TestFilter.fuzzer
(Fuzz.list Fuzz.string)
(Fuzz.pair (Fuzz.list Fuzz.string) (Fuzz.list Fuzz.string))
"Gaps can be bridged"
(\filter l1 ( l2, l3 ) ->
Timeline.empty
|> Timeline.insert
{ events = l1
, filter = filter
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.insert
{ events = l3
, filter = filter
, start = Just "token_3"
, end = "token_4"
}
|> Timeline.insert
{ events = l2
, filter = filter
, start = Just "token_2"
, end = "token_3"
}
|> Timeline.mostRecentEventsFrom filter "token_4"
|> Expect.equal [ List.concat [ l1, l2, l3 ] ]
)
]
, describe "JSON"
[ fuzz fuzzer
"Encode + Decode gives same output"
(\timeline ->
timeline
|> Json.encode Timeline.coder
|> D.decodeValue (Json.decode Timeline.coder)
|> Result.map Tuple.first
|> Result.map (Timeline.mostRecentEvents Filter.pass)
|> Expect.equal (Ok <| Timeline.mostRecentEvents Filter.pass timeline)
)
]
, describe "Weird loops"
[ fuzz TestFilter.fuzzer
"Weird loops stop looping"
(\filter ->
Timeline.empty
|> Timeline.insert
{ events = [ "a", "b", "c" ]
, filter = filter
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.insert
{ events = [ "d", "e", "f" ]
, filter = filter
, start = Just "token_2"
, end = "token_3"
}
|> Timeline.insert
{ events = [ "g", "h", "i" ]
, filter = filter
, start = Just "token_3"
, end = "token_2"
}
|> Timeline.mostRecentEventsFrom filter "token_2"
|> Expect.equal
[ [ "a", "b", "c" ]
, [ "d", "e", "f", "g", "h", "i" ]
]
)
]
, describe "Sync"
[ fuzz TestFilter.fuzzer
"Sync fills gaps"
(\filter ->
Timeline.empty
|> Timeline.addSync
{ events = [ "a", "b", "c" ]
, filter = filter
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.addSync
{ events = [ "f", "g", "h" ]
, filter = filter
, start = Just "token_3"
, end = "token_4"
}
|> Timeline.insert
{ events = [ "d", "e" ]
, filter = filter
, start = Just "token_2"
, end = "token_3"
}
|> Timeline.mostRecentEvents filter
|> Expect.equal [ [ "a", "b", "c", "d", "e", "f", "g", "h" ] ]
)
, fuzz TestFilter.fuzzer
"Sync doesn't fill open gaps"
(\filter ->
Timeline.empty
|> Timeline.addSync
{ events = [ "a", "b", "c" ]
, filter = filter
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.addSync
{ events = [ "f", "g", "h" ]
, filter = filter
, start = Just "token_3"
, end = "token_4"
}
|> Timeline.mostRecentEvents filter
|> Expect.equal [ [ "f", "g", "h" ] ]
)
, fuzz3 (Fuzz.pair Fuzz.string Fuzz.string)
fuzzer
TestFilter.fuzzer
"Getting /sync is the same as getting from the token"
(\( start, end ) timeline filter ->
let
t : Timeline
t =
Timeline.addSync
{ events = [ "a", "b", "c" ]
, filter = filter
, start = Just start
, end = end
}
timeline
in
Expect.equal
(Timeline.mostRecentEvents filter t)
(Timeline.mostRecentEventsFrom filter end t)
)
, fuzz TestFilter.fuzzer
"Weird loops stop looping"
(\filter ->
Timeline.empty
|> Timeline.insert
{ events = [ "a", "b", "c" ]
, filter = filter
, start = Just "token_1"
, end = "token_2"
}
|> Timeline.insert
{ events = [ "d", "e", "f" ]
, filter = filter
, start = Just "token_2"
, end = "token_3"
}
|> Timeline.insert
{ events = [ "g", "h", "i" ]
, filter = filter
, start = Just "token_3"
, end = "token_2"
}
|> Timeline.mostRecentEventsFrom filter "token_2"
|> Expect.equal
[ [ "a", "b", "c" ]
, [ "d", "e", "f", "g", "h", "i" ]
]
)
]
]