3.0.0-alpha.0

This release went quite smoothly and FAST thanks to the recent Docker changes: we're now pulling Docker images instead of building them.

Improve parser's e2e test:

The problem was that I loaded real feeds into the feedQueue and made it download real resources through network requests. This approach wasn't correct and was prone to random failures. David suggested using the resources from test-web-content instead. Looking back, I think I need to create another XML feed, since the feeds below are duplicates. PR #3773

const valid = [
  {
    author: 'Tue Nguyen',
    url: 'http://localhost:8888/feed.xml',
  },
  {
    author: 'Antonio Bennett',
    url: 'http://localhost:8888/feed.xml',
  },
];

And for invalid feeds, I just gave the queue non-existent feed URLs:

const invalid = [
  {
    author: 'John Doe',
    url: 'https://johnhasinvalidfeed.com/feed',
  },
  {
    author: 'Jane Doe',
    url: 'https://janehasinvalidfeed.com/feed',
  },
];
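
With both fixtures in place, the test only ever talks to the locally served test-web-content. Here's a hypothetical sketch of the shape of that test, assuming Jest (processFeed is an illustrative helper, not the actual parser API or test code from the PR):

describe('feed queue e2e', () => {
  // Valid fixtures point at test-web-content served on localhost:8888,
  // so nothing goes out over the real network.
  valid.forEach((feed) => {
    test(`parses the fixture feed for ${feed.author}`, async () => {
      await expect(processFeed(feed)).resolves.toBeDefined();
    });
  });

  // Invalid fixtures use non-existent hosts, so parsing should fail.
  invalid.forEach((feed) => {
    test(`fails to parse the feed for ${feed.author}`, async () => {
      await expect(processFeed(feed)).rejects.toThrow();
    });
  });
});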

Change parser to pull feeds from Supabase:

I added SUPABASE_URL & ANON_KEY to the parser container so it can connect to the database, and wrote a function that returns all feeds. PR #3363

  async getAllFeeds() {
    // Fetch every feed's author name and url from the Supabase `feeds` table.
    const { data: feeds, error } = await supabase.from('feeds').select('wiki_author_name, url');
    if (error) {
      logger.warn({ error });
      throw new Error(`can't fetch feeds from supabase: ${error.message}`);
    }
    // Rename wiki_author_name -> author to match the shape the parser expects.
    const formattedFeeds = feeds.map((feed) => ({
      author: feed.wiki_author_name,
      url: feed.url,
    }));
    return formattedFeeds;
  },
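
For context, here's a minimal sketch of how the supabase client used above could be created from those environment variables, assuming the official @supabase/supabase-js package (the variable names mirror what I added to the container, but this isn't the exact parser code):

const { createClient } = require('@supabase/supabase-js');

// SUPABASE_URL and ANON_KEY are injected into the parser container's environment.
const { SUPABASE_URL, ANON_KEY } = process.env;
const supabase = createClient(SUPABASE_URL, ANON_KEY);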

Some pieces of the tests had to be modified, and the wiki-parsing code was removed.

Remove src/backend:

This one is huge, but at least I didn't have to write new code 😅. Most of the work was determining which parts of the legacy backend were still used by other services. Jerry helped me a lot with this PR; we went through the files and found that API_URL was still being passed to services, though it had only one use, in /src/web/app/src/pages/_document.tsx.
The PR does work, but there's still work to be done and tested.

What I've learned doing reviews or looking at other PRs 😛:

  • SWR React hook from PR #3240 (see the sketch after this list)
  • Interesting way to solve the back up issue PR #3405
  • Use ORM to manage database change PR #3418
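
Since SWR was new to me, here's a minimal sketch of the hook's basic usage (the /api/feeds endpoint and the Feeds component are just illustrative, not code from PR #3240):

import useSWR from 'swr';

// SWR takes a key and a fetcher, caches the result, and revalidates it in the background.
const fetcher = (url) => fetch(url).then((res) => res.json());

function Feeds() {
  const { data, error } = useSWR('/api/feeds', fetcher);

  if (error) return <p>Failed to load feeds</p>;
  if (!data) return <p>Loading...</p>;

  return (
    <ul>
      {data.map((feed) => (
        <li key={feed.url}>{feed.author}</li>
      ))}
    </ul>
  );
}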


This content originally appeared on DEV Community and was authored by Tue

