
Update/dependencies #36

Draft: wants to merge 2 commits into base: master

Conversation

rssdev10

Update dependencies. Reduce version requirements.

  • accompanying code changes

@rssdev10 rssdev10 marked this pull request as draft November 19, 2023 06:40
@rssdev10
Author

Hi, I'm trying to make TextModels compilable again.
Result of testing:

Test Summary:                  | Pass  Error  Total   Time
All tests                      |   93      3     96  23.4s
  📂 crf.jl                    |           3      3   3.4s
    crf                        |           3      3   3.3s
      Loss function            |           1      1   1.0s
      Viterbi Decode           |           1      1   0.7s
      CRF with Flux Layers     |           1      1   1.6s
  📂 ner.jl                    |   13            13   5.0s
    NER                        |   13            13   5.0s
  📂 pos.jl                    |   14            14   1.0s
    POS                        |   14            14   1.0s
  📂 sentiment.jl              |    6             6   0.6s
  📂 averagePerceptronTagger.jl |   13            13   7.4s
    Average Perceptron Tagger  |   13            13   7.4s
  📂 ulmfit.jl                 |   47            47   5.8s
    Custom layers              |   38            38   3.0s
    Language model             |    7             7   1.6s
    Text Classifier            |    2             2   0.8s
ERROR: LoadError: Some tests did not pass: 93 passed, 0 failed, 3 errored, 0 broken.

ulmfit was updated for Flux >= 0.13 and, based on the unit tests, this works fine.

The biggest issue is crf. I'm not sure how it was working before; looking into the unit test, the problem is https://github.com/JuliaText/TextModels.jl/blob/master/test/crf.jl#L7 :

        input_seq = [rand(4) for i in 1:3]   # 3 observations, each an array of 4 elements
        c = CRF(2)                           # CRF with 2 labels, so c.n == 2

        scores = []
        push!(scores, score_sequence(c, input_seq, [onehot(1, 1:2), onehot(1, 1:2), onehot(1, 1:2)]))
        #...

        init_α = fill(-10000, (c.n + 2, 1))  # a (c.n + 2) x 1 = 4 x 1 matrix
        init_α[c.n + 1] = 0

        s1 = sum(exp.(scores))
        s2 = exp(forward_score(c, input_seq, init_α))

        @test (s1 - s2) / max(s1, s2) <= 0.00000001

score_sequence cannot be evaluated because input_seq contains arrays of 4 elements, while each onehot(1, 1:2) produces only 2 labels. As a result, onecold() inside score_sequence throws a size mismatch.
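
To make the size mismatch concrete, here is a minimal sketch using only Flux's onehot/onecold, not TextModels itself; the exact failure point inside score_sequence may differ, this just shows the incompatible sizes:

    using Flux: onehot, onecold

    input_seq = [rand(4) for i in 1:3]          # each observation is an array of 4 elements
    labels    = [onehot(1, 1:2) for _ in 1:3]   # each label is a 2-element one-hot vector

    length(input_seq[1])    # 4
    length(labels[1])       # 2
    onecold(labels[1])      # 1, i.e. a label index in 1:2

    # any elementwise combination of the two sizes fails, e.g.
    # input_seq[1] .* labels[1]   # ERROR: DimensionMismatch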

On the other hand,

        init_α = fill(-10000, (c.n + 2, 1))
        init_α[c.n + 1] = 0

        s2 = exp(forward_score(c, input_seq, init_α))

this code suggests that 4 elements per observation are expected in input_seq = [rand(4) for i in 1:3], not 2, which is the number of labels declared in CRF(2) and onehot(1, 1:2)...
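
For illustration, the sizes involved (using a plain named tuple as a hypothetical stand-in for CRF(2); only the field n is assumed to matter here):

    c = (n = 2,)                          # stand-in for CRF(2): n is the number of labels

    init_α = fill(-10000, (c.n + 2, 1))   # a 4x1 matrix when c.n == 2
    init_α[c.n + 1] = 0

    size(init_α)    # (4, 1): matches the 4-element observations in input_seq,
                    # not the 2 labels declared by CRF(2) and onehot(1, 1:2)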

@AdarshKumar712 any suggestions about this?
