try to work out the lighting of hydrophones when orcas are heard #43

Open · Devesh21700Kumar opened this issue Mar 26, 2021 · 8 comments
Labels: enhancement (New feature or request)

Comments

@Devesh21700Kumar
Contributor

Problem description:

When orcas are heard, their coordinates are recorded in the Google Sheet and then rendered on the Google Sheets vector layer. At that point, the hydrophone near those coordinates should light up on the map.

@ivanoats I would like to learn more about this problem statement and then work out a solution, which would include a lit-up hydrophone SVG/icon/photo whenever the coordinates of a detection are mapped.
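To make the intent concrete, here is a minimal front-end sketch; the component and icon file names are hypothetical, not existing Orcamap assets:

```tsx
// HydrophoneMarker.tsx — hypothetical component; icon paths are placeholders.
import React from "react";

interface HydrophoneMarkerProps {
  name: string;
  active: boolean; // true when a recent detection maps to this hydrophone
}

export function HydrophoneMarker({ name, active }: HydrophoneMarkerProps) {
  // Swap between a "lit" and a normal SVG depending on detection state.
  const icon = active ? "/icons/hydrophone-lit.svg" : "/icons/hydrophone.svg";
  const label = `${name} hydrophone${active ? " (orcas heard)" : ""}`;
  return <img src={icon} alt={label} />;
}
```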

@DhananjayPurohit
Member

Based on my understanding, the idea of lighting up hydrophones is more closely tied to orcas detected at a hydrophone (whether by a user listening to the live stream or by the ML tool). Implementing a webhook that tells the map to light up a given hydrophone would work well here. A second thought: adding a verified entry to the spreadsheet could trigger the hydrophone on the map to light up. This still needs discussion @scottveirs @ivanoats.

@Devesh21700Kumar
Contributor Author

> Based on my understanding, the idea of lighting up hydrophones is more closely tied to orcas detected at a hydrophone (whether by a user listening to the live stream or by the ML tool). Implementing a webhook that tells the map to light up a given hydrophone would work well here. A second thought: adding a verified entry to the spreadsheet could trigger the hydrophone on the map to light up. This still needs discussion @scottveirs @ivanoats.

Yes, a verified entry in the spreadsheet is the approach I was also thinking of.
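For that approach, the map still has to resolve a verified entry's coordinates to a specific hydrophone. A rough sketch, assuming hypothetical types and an arbitrary 10 km radius:

```ts
// nearestHydrophone.ts — sketch with assumed types; the 10 km threshold
// is an arbitrary placeholder, not an agreed value.
interface Point { lat: number; lon: number; }
interface Hydrophone extends Point { id: string; }

// Haversine great-circle distance in kilometers.
function distanceKm(a: Point, b: Point): number {
  const R = 6371; // Earth radius in km
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLon = toRad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Returns the hydrophone to light up for a verified entry, or null if
// none lies within the threshold radius.
function nearestHydrophone(
  entry: Point,
  hydrophones: Hydrophone[],
  maxKm = 10
): Hydrophone | null {
  let best: Hydrophone | null = null;
  let bestDist = maxKm;
  for (const h of hydrophones) {
    const d = distanceKm(entry, h);
    if (d < bestDist) {
      best = h;
      bestDist = d;
    }
  }
  return best;
}
```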

@ivanoats
Member

Does a webhook depend on us moving orcamap to Next.js so that we can add /api routes to our orcamap app?

@Devesh21700Kumar
Contributor Author

> Does a webhook depend on us moving orcamap to Next.js so that we can add /api routes to our orcamap app?

I might be wrong here, but I think a webhook would be compatible with both create-react-app and Next.js, and is independent of the migration.

@DhananjayPurohit
Member

> Does a webhook depend on us moving orcamap to Next.js so that we can add /api routes to our orcamap app?

I have never worked with Next.js before, but I think using webhooks after the migration to Next.js would be better, since we can easily turn an API route into a webhook that accepts HTTP requests on events such as changes in the spreadsheet, updates to SSEMMI validated/moderated data, or a button click from @scottveirs indicating an activated hydrophone (to support the lighting of hydrophones).
This would remove the need to poll for new data every 10 seconds and could save many API calls.
I would love to tag @aalaydrus (SSEMMI developer) in this discussion to learn more about how we can trigger an event from the SSEMMI server side to inform our Next.js app about updates to validated/moderated data.
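To sketch the webhook idea (the route name, payload shape, and in-memory store are assumptions for illustration, not an agreed contract), a Next.js API route could accept POSTs from the spreadsheet/SSEMMI side and track which hydrophones are active:

```ts
// pages/api/detections.ts — hypothetical webhook receiver. A real version
// would validate a shared secret and persist state instead of using memory.
import type { NextApiRequest, NextApiResponse } from "next";

// In-memory store for illustration only; resets on redeploy.
const activeHydrophones = new Set<string>();

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== "POST") {
    res.setHeader("Allow", "POST");
    return res.status(405).end("Method Not Allowed");
  }
  const { hydrophoneId, detected } = req.body ?? {};
  if (typeof hydrophoneId !== "string") {
    return res.status(400).json({ error: "hydrophoneId is required" });
  }
  if (detected) activeHydrophones.add(hydrophoneId);
  else activeHydrophones.delete(hydrophoneId);
  return res.status(200).json({ active: Array.from(activeHydrophones) });
}
```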

@Devesh21700Kumar
Contributor Author

Yes @DhananjayPurohit @ivanoats, that's also true. As far as the API features of Next.js go, any file inside the pages/api folder is mapped to /api/* and treated as an API endpoint instead of a page. These are server-side-only bundles and won't increase our client-side bundle size.

With Next.js, we can also leverage its built-in data-fetching APIs to format our data and pre-render the site.
So even though the webhook is independent of the framework used, it will definitely be smoother with Next.js.
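As a companion sketch of the data-fetching point (the /api/active-hydrophones endpoint and BASE_URL env var are hypothetical, and global fetch assumes Node 18+ or a polyfill), a page could pre-render the current hydrophone state server-side:

```tsx
// pages/map.tsx — sketch of pre-rendering with Next.js data fetching.
// Server-side fetch needs an absolute URL, hence the assumed BASE_URL.
import type { GetServerSideProps } from "next";

interface MapProps {
  active: string[]; // ids of hydrophones to light up
}

export const getServerSideProps: GetServerSideProps<MapProps> = async () => {
  const base = process.env.BASE_URL ?? "http://localhost:3000";
  const res = await fetch(`${base}/api/active-hydrophones`);
  const { active } = await res.json();
  return { props: { active } };
};

export default function Map({ active }: MapProps) {
  return <p>Hydrophones currently lit: {active.join(", ") || "none"}</p>;
}
```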

@scottveirs
Member

At this stage, I can mainly offer the following current data sources:

Orcamap hydrophone locations could indicate "realtime activity" based on any or all of:

  1. When any human pushes the "I hear something" button within the Orcasound live-listening web app. Those go into a Postgres db on a Heroku instance. One could ask Skander and Paul on Slack about how to access them. I'm not sure if there is API access...

  2. When the real-time inference system built by Microsoft volunteer hackers generates any prediction above some threshold, and/or when an expert/moderator tags the prediction candidate as a true positive. Those candidates and their labels currently live in a Cosmos DB in an AI for Orcas Azure account. On Slack, Prakruti or Akash would be the best contacts for learning about API etc access.

  3. When a sighting is reported near any of the hydrophones. The existing route for such visual location data is the Google sheet associated with Orcamap (via Google API) but the SSEMMI decentralized db API is also ready for testing, I think...
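For source 3, here is a hedged sketch of pulling rows via the Google Sheets API v4 REST endpoint; the sheet ID, range, and column layout below are placeholders, not the real Orcamap values:

```ts
// fetchSightings.ts — sketch against the public Sheets v4 values endpoint.
// SHEET_ID, RANGE, and the assumed columns are illustrative placeholders.
const SHEET_ID = "YOUR_SHEET_ID";
const RANGE = "Sightings!A2:D";
const API_KEY = process.env.GOOGLE_API_KEY;

interface Sighting {
  timestamp: string;
  lat: number;
  lon: number;
  verified: boolean;
}

async function fetchSightings(): Promise<Sighting[]> {
  const url =
    `https://sheets.googleapis.com/v4/spreadsheets/${SHEET_ID}/values/` +
    `${encodeURIComponent(RANGE)}?key=${API_KEY}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Sheets API error: ${res.status}`);
  const data = await res.json();
  // Assumed column order: timestamp, lat, lon, verified ("TRUE"/"FALSE").
  return (data.values ?? []).map((row: string[]) => ({
    timestamp: row[0],
    lat: Number(row[1]),
    lon: Number(row[2]),
    verified: row[3] === "TRUE",
  }));
}
```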

@scottveirs
Member

Assuming this feature is honed a bit to ONLY "light up" Orcasound hydrophone locations when killer whales are being heard (as opposed to other soniferous species, like a humpback whale), then an emerging, simpler method might be to leverage the Acartia API and only pull acoustic detections that have been associated with the SRKW ecotype (and maybe later also the Bigg's ecotype). It could be done with the existing Comment field, but should likely await a revision of the Acartia data schema.
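As a sketch of that filtering (the endpoint path and field names are assumptions based on this discussion, not the documented Acartia schema; check the real API before relying on them):

```ts
// acartia.ts — hypothetical pull of SRKW-tagged detections from Acartia.
const ACARTIA_URL = "https://acartia.io/api/v1/sightings"; // assumed path

interface AcartiaEntry {
  latitude: string;
  longitude: string;
  type?: string;     // assumed ecotype/species label field
  comments?: string; // the existing Comment field mentioned above
}

async function fetchSrkwAcousticDetections(): Promise<AcartiaEntry[]> {
  const res = await fetch(ACARTIA_URL, {
    headers: { Authorization: `Bearer ${process.env.ACARTIA_TOKEN ?? ""}` },
  });
  if (!res.ok) throw new Error(`Acartia API error: ${res.status}`);
  const entries: AcartiaEntry[] = await res.json();
  // Until the schema revision lands, fall back to scanning the Comment field.
  return entries.filter(
    (e) => /SRKW/i.test(e.type ?? "") || /SRKW/i.test(e.comments ?? "")
  );
}
```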
