Feed Aggregator Page 657
Rendered on Thu, 08 Oct 2020 22:33:25 GMT
via Elm - Latest posts by @berend Berend de Boer on Thu, 08 Oct 2020 22:09:30 GMT
But that’s not how Elm works anywhere.
via Elm - Latest posts by @ream88 Mario Uher on Thu, 08 Oct 2020 22:04:44 GMT
Hey, as the title says, I think I found a bug in Elm 0.19.
To understand why this baffled me, let me explain what I did: I have finally found some time to update the codebase at Yodel from 0.18 to Elm 0.19.1, one app at a time. Some circular dependencies inside our apps and the big changes that happened in the 0.19 update blocked me until now (the first PR I started was in August 2018, but was never finished).
Ok, long story short, we have a module called ShortUUID which allows us to convert (long) UUIDs like "64d7280f-736a-4ffa-b9c0-383f43486d0b" to a shorter version ("DTEETeS5R2XxjrVTZxXoJS") and back. You can find it here. I converted the code to 0.19 and also switched from the unsupported hickscorp/elm-bigint to the compatible cmditch/elm-bigint. And my tests kept failing for an hour or so. I suspected the new BigInt package and wanted to call it a day. And then I found the problem:
This is the working version:
encodeHelper : String -> BigInt -> String
encodeHelper output input =
    if BigInt.gt input zero then
        let
            index =
                input
                    |> BigInt.modBy abcLength
                    |> Maybe.map BigInt.toString
                    |> Maybe.andThen String.toInt
                    |> Maybe.withDefault 0

            char =
                abc
                    |> List.getAt index
                    |> Maybe.withDefault ' '

            newInput =
                BigInt.div input abcLength

            newOutput =
                String.cons char output
        in
        encodeHelper newOutput newInput

    else
        output
and this is the broken one:
encodeHelper : String -> BigInt -> String
encodeHelper output input =
    if BigInt.gt input zero then
        let
            index =
                newInput
                    |> BigInt.modBy abcLength
                    |> Maybe.map BigInt.toString
                    |> Maybe.andThen String.toInt
                    |> Maybe.withDefault 0

            char =
                abc
                    |> List.getAt index
                    |> Maybe.withDefault ' '

            newInput =
                BigInt.div input abcLength

            newOutput =
                String.cons char output
        in
        encodeHelper newOutput newInput

    else
        output
Do you see the difference? I used newInput in the first index assignment instead of just input. And the compiler did not say a single word, which is very Elm-unlike. newInput is clearly defined below, and should in my opinion not be available above its definition.
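For reference, here is a minimal sketch (not from the post) of the scoping behavior in question - a let binding referring to a name that is only written further down:

example : Int
example =
    let
        -- `b` is used here but only defined a few lines below,
        -- analogous to `newInput` in the broken version above
        a =
            b + 1

        b =
            2
    in
    a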
My question now: Is this intended behavior or a bug in the Elm compiler?
via Elm - Latest posts by @Atlewee Atle Wee Førre on Thu, 08 Oct 2020 22:00:09 GMT
Do you need ports at all? Would it not be better to only use one or more custom elements?
via Elm - Latest posts by @system system on Thu, 08 Oct 2020 21:06:10 GMT
This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.
via Elm - Latest posts by @marcw Marc Walter on Thu, 08 Oct 2020 19:19:40 GMT
I created a WebRTC conference example with custom elements (often called webcomponents) for rendering. The behavior is similar to the simple example from the W3C WebRTC spec.
My main goal was to NOT have two applications (one in Elm, one in JS) that need to synchronize state, but instead keep all state inside the Elm app, use ports to mutate the opaque JSON values, and display video streams using custom elements.
When comparing it to a solution without custom elements, it is obvious that I was able to make do with fewer ports.
I definitely liked that it allowed me to react to DOM events emitted by the custom elements instead of needing to use ports, which would either have to listen to many more events at a time than actually needed, or require many different subscriptions depending on the app state.
Instead I could write something like this inside the view functions, e.g. to wait until the custom element creates a new RTCPeerConnection object.
-- in ./src/Active/View.elm
viewPending : Model.PendingUser -> Html Msg
viewPending user =
    H.node "webrtc-media"
        [ ...
        , onCustomEvent "new-peer-connection" (Msg.UserUpdated user.id) Msg.peerConnectionDecoder
        ]
        []
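The onCustomEvent helper is not shown in the snippet above; a hypothetical sketch of how such a helper could be written (assuming it decodes the event's detail field - the actual implementation is in the linked repository):

import Html
import Html.Events
import Json.Decode


-- Sketch: listen for a named DOM event emitted by the custom element and
-- decode its `detail` payload into a message.
onCustomEvent : String -> (a -> msg) -> Json.Decode.Decoder a -> Html.Attribute msg
onCustomEvent eventName toMsg decoder =
    Html.Events.on eventName
        (Json.Decode.map toMsg (Json.Decode.field "detail" decoder))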
And then use a port to add the local media stream to it, and create an SDP offer.
-- in ./src/Active/Update.elm
updatePendingUser : Msg.UserId -> Model.Stream -> Msg.Updated -> Model.PendingUser -> ( User, Cmd msg )
updatePendingUser ownId localStream msg user =
    case msg of
        Msg.NewPeerConnection pc ->
            ( Model.User ...
            , if ownId < user.id then
                -- executes `initiateSdpOffer/3` in ./src/index.js
                Ports.Out.createSdpOfferFor user.id pc localStream

              else
                ... -- in this case we would expect to receive an SDP offer from the other peer
            )

        ...
This offer will then be sent over the signaling server to the other browser to start the negotiation process.
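For orientation, a hypothetical sketch of what a port wrapper like Ports.Out.createSdpOfferFor could look like (the real code is in the repository linked below; the field names and the Int user id are assumptions):

port module Ports.Out exposing (createSdpOfferFor)

import Json.Encode as Encode


-- Ports accept a single JSON-compatible value, so the peer connection and
-- media stream are passed through as opaque Json.Encode.Value.
port createSdpOffer : Encode.Value -> Cmd msg


createSdpOfferFor : Int -> Encode.Value -> Encode.Value -> Cmd msg
createSdpOfferFor userId pc localStream =
    createSdpOffer
        (Encode.object
            [ ( "userId", Encode.int userId )
            , ( "pc", pc )
            , ( "stream", localStream )
            ]
        )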
Elm code and a simple signaling server are available on github.
I would also love to hear feedback, especially whether it makes sense from your point of view to rely on custom element communication or not, or if someone noticed problems and thinks ports would offer a cleaner approach.
via Elm - Latest posts by @meowgorithm Christian Rocha on Thu, 08 Oct 2020 18:56:53 GMT
Caveat: this is a project written in Go which implements the Elm Architecture.
Hi everyone! My colleagues and I wrote an Elm-inspired framework in Go for building rich terminal applications. We attempted to follow the Elm Architecture fairly closely, even down to smaller details like the way Time.every is implemented. We even had a subscription model for a long time, though we ultimately removed it because of the complexity of the implementation. I’d love to see it return, however.
As an Elm enthusiast, it’s been a joy to use Elm’s functional paradigms for the CLI. It’s equally nice to see how well the Elm Architecture can translate across languages. We chose Go mainly for its practicality, dependable tooling and very relevant cross-platform support.
We also have a core library of sorts:
And a few hefty projects we’ve written using the framework.
Would love to hear any thoughts and happy to answer any questions!
via Elm - Latest posts by @system system on Thu, 08 Oct 2020 18:47:23 GMT
This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.
via Elm - Latest posts by @lazurski Tad Lispy on Thu, 08 Oct 2020 12:58:56 GMT
Very interesting indeed! Thanks for sharing.
via Elm - Latest posts by @rupert Rupert Smith on Thu, 08 Oct 2020 12:26:12 GMT
It occurs to me that you might find this Elm Europe talk interesting:
via Elm - Latest posts by @lazurski Tad Lispy on Thu, 08 Oct 2020 12:07:11 GMT
Yes! I think your technical explanation is also correct. Basically if I type b here: a|c (the pipe represents the cursor), then the browser immediately updates the DOM value to abc, but the Elm update brings it back to ab (in accordance with the model, which hasn’t changed yet). Then the time message comes in (probably in the next animation frame, as you suggested) and the text input gets the final value of abc. Kind of back and forth. From the browser’s perspective the last two updates are not triggered by an input event, and this must be throwing off the cursor.
If this is correct then the only two ways to prevent it are either to make Time.now synchronous (probably impossible) or to update the model in the first pass of update and only later update the version.
I really like the solution suggested by @eike (and in part @rupert). In fact I can delay the whole timestamp business until the user exports the document (in my case that’s either uploading it to the server or downloading a file).
So all I need to do is count the changes since the document was loaded. And since I don’t even care about how many times it was changed, only that it was, I can represent the version as Maybe Time.Posix where Nothing indicates that the document was modified since the last export. When the document is exported, I check if it was modified (version == Nothing). If so, I update the version before it’s saved. If it’s Just version I keep it as it was.
This should also improve performance when typing - only one update per keystroke!
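A minimal sketch of that representation (names are illustrative, not my actual code):

import Task
import Time


type alias Doc =
    { text : String
    , version : Maybe Time.Posix -- Nothing = modified since the last export
    }


type Msg
    = Exported Time.Posix


-- Every edit just clears the version; no Task round-trip per keystroke.
markModified : Doc -> Doc
markModified doc =
    { doc | version = Nothing }


-- Only on export do we ask for a timestamp, and only if the document
-- actually changed since the last export.
exportCmd : Doc -> Cmd Msg
exportCmd doc =
    case doc.version of
        Nothing ->
            Task.perform Exported Time.now

        Just _ ->
            Cmd.none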
Thank you all for your input. It will help me model my program better!
via Elm - Latest posts by @SupriyaR SupriyaR on Thu, 08 Oct 2020 11:25:07 GMT
Hello Kaspar,
We at Clarion Technologies offer the same kind of services that you’re looking for. Hire developers from Clarion who work as your in-house team.
To discuss your requirements, kindly reach me over email: supriya.rathi@clariontechnologies.co.in
Thanks
via Elm - Latest posts by @albertdahlin Albert Dahlin on Thu, 08 Oct 2020 11:12:09 GMT
As I see it, there are two problems here:
My (partial) understanding of your problem with atomic updates is that you need both a string and a time. In other words, you need to unambiguously pair a text with the corresponding timestamp.
The cursor jumping problem occurs when the virtual dom updates the content of the text area. As far as I understand, this only happens if the value in the model is different from the <textarea value=""> in the DOM; the vdom only updates the textarea if the values are different. When using tasks with Time.now, the resulting update seems to happen in the next animation frame.
My guess is that the flow of events would look something like this:
Maybe a solution could be to separate your Doc type and the view state for the textarea? Something along these lines:
type alias Model =
    { text : String -- populate this with the string representation from Doc when it changes?
    , doc : Doc
    }


type Msg
    = TextareaChanged String
    | GotTimeFor String Time.Posix


update msg model =
    case msg of
        TextareaChanged text ->
            ( { model | text = text }
            , Task.perform (GotTimeFor text) Time.now
            )

        GotTimeFor text time ->
            -- handle your atomic update here
            ...
via Elm - Latest posts by @eike Eike Schulte on Thu, 08 Oct 2020 11:09:37 GMT
If you really want to go with time stamps, you might also try going with a hybrid approach where you first assign a numerical version number, so you can update the model immediately and at the same time request a time stamp for it. Once the time stamp arrives, you replace the version number by the time stamp. So you would have
type Version = Preliminary Int | Permanent TimeStamp
and a message GetTimeStamp Int TimeStamp (so you can combine the right time stamp with the right preliminary version number).
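A minimal sketch of how that could be wired up (assuming TimeStamp is Time.Posix; the model shape is made up):

import Time


type Version
    = Preliminary Int
    | Permanent Time.Posix


type alias Model =
    { version : Version }


type Msg
    = GetTimeStamp Int Time.Posix


-- Swap the preliminary number for the real time stamp, but only if it
-- still matches; otherwise a newer edit has already superseded it.
update : Msg -> Model -> Model
update msg model =
    case msg of
        GetTimeStamp n time ->
            if model.version == Preliminary n then
                { model | version = Permanent time }

            else
                model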
via Elm - Latest posts by @lazurski Tad Lispy on Thu, 08 Oct 2020 11:02:38 GMT
Right. That’s the good part of your solution - it’s atomic. But I was thinking about a scenario where two users open the document (version 1). Each of them makes one change (maybe one removes a section and the other inserts a comma). Each of them has version 2 now, but they are two different versions. With timestamps that’s very unlikely.
Also the semantic nature of a timestamp is really good in my application. Granted, in some applications a counter, checksum, etc. would be preferable. But I’ve put some thought into this and believe the timestamp will work best in the overall design of the system.
For now I do roughly what @albertdahlin suggested, but I’m still hoping for a better solution.
via Elm - Latest posts by @rupert Rupert Smith on Thu, 08 Oct 2020 09:53:59 GMT
The update function in Elm is single threaded - as JavaScript in the browser is single threaded. This means that it cannot have a race condition where 2 updates get the same value - the increments will be ‘atomic’.
I would actually say that using a timestamp could result in 2 updates getting the same timestamp. If they were processed in succession very quickly, they might end up getting the same millisecond timestamp. I am not sure, though, how quickly in succession Elm can call Time.now, and whether it is really possible to get the same timestamp out of it.
via Elm - Latest posts by @albertdahlin Albert Dahlin on Thu, 08 Oct 2020 09:30:34 GMT
That is a good example of tricky concurrency that is not obvious at first.
I would like to elaborate on some things. The view and the DOM are always “behind” the Model: the view will update at most every animation frame (commonly every 16ms, but it might be slower) and always runs after the call to update. Consider this example:
type Msg
    = SearchInputChanged String
    | GotSearchSuggestions (Result Http.Error (List String))


update msg model =
    case msg of
        SearchInputChanged newText ->
            ( { model | searchText = newText }
            , fetchSuggestionsFromServer newText
            )

        GotSearchSuggestions (Ok suggestions) ->
            ( { model | suggestions = suggestions }
            , Cmd.none
            )

        GotSearchSuggestions (Err _) ->
            ( model
            , Cmd.none
            )


fetchSuggestionsFromServer text =
    Http.get
        { url = "http://example.com/?search=" ++ text
        , expect = Http.expectJson GotSearchSuggestions (Json.Decode.list Json.Decode.string)
        }
The problem is the assumption that the responses from the server will come in the order the requests were sent. If two requests are sent and the first takes longer for the server to process, the responses will arrive in reverse order. This will result in model.suggestions containing stale data.
A solution here is to include the search text in the reply msg by adding an argument to GotSearchSuggestions:
type Msg
    = SearchInputChanged String
    | GotSearchSuggestions String (Result Http.Error (List String))
Now you can check if the suggestions you received match model.searchText and if not, throw the result away.
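For illustration, the update from the earlier example might then look something like this (a sketch; fetchSuggestionsFromServer would also have to pass the text along via Http.expectJson (GotSearchSuggestions text) ...):

update msg model =
    case msg of
        SearchInputChanged newText ->
            ( { model | searchText = newText }
            , fetchSuggestionsFromServer newText
            )

        GotSearchSuggestions query (Ok suggestions) ->
            if query == model.searchText then
                ( { model | suggestions = suggestions }, Cmd.none )

            else
                -- stale response for an earlier query; discard it
                ( model, Cmd.none )

        GotSearchSuggestions _ (Err _) ->
            ( model, Cmd.none )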
via Elm - Latest posts by @swisscheese Jim Lewis on Thu, 08 Oct 2020 09:16:05 GMT
I’m no longer looking for a tutor. Thanks for the PM replies.
via Elm - Latest posts by @lazurski Tad Lispy on Thu, 08 Oct 2020 09:14:28 GMT
Thank you for your input. I really appreciate it!
You (and @rupert) are right about this. Please consider that my Ellie is a very reduced example. As shown in the apply function, I only pass the Doc, not the whole Model!
The problem with your solution @albertdahlin is that the update is not atomic. Basically you change the text and then the time separately. In the real app I have tens of possible events and transformations. Each has to be handled in the update function. Also the document is in a Maybe, so every time I have to account for it being Nothing. In my current solution (that breaks input elements) I just return:
let
    transform =
        Doc.setText value
in
( model
, model.doc
    |> Maybe.map (Doc.apply transform)
    |> Maybe.map (Task.perform DocTransformationApplied)
    |> Maybe.withDefault Cmd.none
)
I want to make it impossible to update the document without updating the version (as the guru said “make invalid states impossible”).
If I don’t find a better way I will do it like you suggested, but I hope to keep the updates atomic.
It would solve the issue of double updates, but it has two downsides. (1) If two users update the document concurrently there might be a version clash. This is very unlikely with a timestamp. I could use something like a UUID, but then I’m losing the ability to order by version. (2) Having the version as a date is user-friendly. It’s nicer to show “this happened when the version from Friday, 13:22 was published” than “this happened when version 23553 was published”. Consider that every keystroke in any input increments the version, so the numbers will get huge very quickly.
via Elm - Latest posts by @rupert Rupert Smith on Thu, 08 Oct 2020 08:46:20 GMT
If you are using the timestamp to version documents, in order to ensure that each version is uniquely tagged and ordered, might you be able to achieve the same thing by using a simple counter? That is, instead of getting the timestamp on each document update, just store a current version as an Int in the model and increment it each time?
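For illustration, a minimal sketch of that idea (field and message names are made up):

type Msg
    = TextChanged String


type alias Model =
    { text : String
    , version : Int
    }


-- Every change bumps the counter in the same update, so the new text and
-- its version number can never get out of sync.
update : Msg -> Model -> Model
update msg model =
    case msg of
        TextChanged newText ->
            { model | text = newText, version = model.version + 1 }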
I would second this - I made an Ellie to show what can happen if multiple events fire in quick succession. A later update can overwrite the model with stale data from an earlier state:
via Elm - Latest posts by @albertdahlin Albert Dahlin on Thu, 08 Oct 2020 08:33:48 GMT
I’ve made an example where the cursor does not jump.
https://ellie-app.com/bbMMCjXngNba1
In general I think one should avoid passing the Model in a Msg since this can cause tricky concurrency behaviours that are hard to reason about.