Compare commits

...

18 commits

Author SHA1 Message Date
Simon Sarasova
ea82419b38
Implemented in-memory storage of trained neural network objects. Trained neural network objects now only have to be created once, so predictions are faster. 2024-08-15 12:14:23 +00:00
Simon Sarasova
91c2345fb3
Removed link to Seekia's defunct Tor onionsite. 2024-08-14 14:37:59 +00:00
Simon Sarasova
11f25c6c8e
Improved Whitepaper.md and Future-Plans.md. 2024-08-14 13:44:01 +00:00
Simon Sarasova
39c4edbe2f
Created the GetUserGenomeLocusValuesMapFromProfile function and used it to remove some duplicated code. 2024-08-14 11:25:25 +00:00
Simon Sarasova
b6f5612bbc
Added new genetic attributes to the calculatedAttributes package. Added the ability to view and sort by these attributes in the GUI. 2024-08-14 11:04:19 +00:00
Simon Sarasova
972252c788
Upgraded Golang to version 1.23. 2024-08-14 04:04:56 +00:00
Simon Sarasova
02676dbda1
Added the Obesity disease to genetic analyses. 2024-08-14 03:37:18 +00:00
Simon Sarasova
8bc2bc01f3
Implemented neural network prediction for polygenic diseases to replace old method. Added autism and homosexualness to genetic analyses. 2024-08-13 13:25:47 +00:00
Simon Sarasova
1f30bfa71c
Improved the helpers package. 2024-08-11 12:31:40 +00:00
Simon Sarasova
60ee8afb6c
Increased the quantity of attributes that are extracted from the OpenSNP biobank data archive. 2024-08-11 11:16:46 +00:00
Simon Sarasova
45e668c05a
Added numeric traits to genetic analyses. 2024-08-09 14:23:37 +00:00
Simon Sarasova
d769047de7
Improved Documentation.md and Future-Plans.md. 2024-08-08 03:16:00 +00:00
Simon Sarasova
124059cebe
Improved Future-Plans.md. 2024-08-08 00:15:27 +00:00
Simon Sarasova
3e56319878
Added Merkle Tree Payment Proofs to Future-Plans.md. 2024-08-07 09:01:42 +00:00
Simon Sarasova
fe754cb6a2
Added the Height trait to the Create Genetic Models utility. 2024-08-07 07:45:31 +00:00
Simon Sarasova
d538afc7a2
Added LocusIsPhased information to the local user profile creation process. 2024-08-05 21:29:14 +00:00
Simon Sarasova
b71b994dc4
Added some timestamps. 2024-08-05 07:39:03 +00:00
Simon Sarasova
03b8503b89
Added the Height trait to the traits package. Migrated locus metadata from json encoding to gob encoding. 2024-08-05 07:11:10 +00:00
158 changed files with 11396 additions and 9537 deletions

@@ -6,6 +6,22 @@ Small and insignificant changes may not be included in this log.
## Unversioned Changes
* Implemented in-memory storage of trained neural network objects. Trained neural network objects now only have to be created once, so predictions are faster. - *Simon Sarasova*
* Removed link to Seekia's defunct Tor onionsite. - *Simon Sarasova*
* Improved Whitepaper.md and Future-Plans.md. - *Simon Sarasova*
* Created the GetUserGenomeLocusValuesMapFromProfile function and used it to remove some duplicated code. - *Simon Sarasova*
* Added new genetic attributes to the calculatedAttributes package. Added the ability to view and sort by these attributes in the GUI. - *Simon Sarasova*
* Upgraded Golang to version 1.23. - *Simon Sarasova*
* Added the Obesity disease to genetic analyses. - *Simon Sarasova*
* Implemented neural network prediction for polygenic diseases to replace old method. Added autism and homosexualness to genetic analyses. - *Simon Sarasova*
* Increased the quantity of attributes that are extracted from the OpenSNP biobank data archive. - *Simon Sarasova*
* Added numeric traits to genetic analyses. - *Simon Sarasova*
* Improved Documentation.md and Future-Plans.md. - *Simon Sarasova*
* Improved Future-Plans.md. - *Simon Sarasova*
* Added Merkle Tree Payment Proofs to Future-Plans.md. - *Simon Sarasova*
* Added the Height trait to the Create Genetic Models utility. - *Simon Sarasova*
* Added LocusIsPhased information to the local user profile creation process. - *Simon Sarasova*
* Added the Height trait to the traits package. Migrated locus metadata from json encoding to gob encoding. - *Simon Sarasova*
* Upgraded Fyne to version 2.5.0. - *Simon Sarasova*
* Added neural network trait prediction to genetic analyses. - *Simon Sarasova*
* Improved the Create Genetic Models utility and neural network training code. Models are now able to predict traits with some accuracy. - *Simon Sarasova*

@@ -9,4 +9,4 @@ Many other people have written code for modules which are imported by Seekia. Th
Name | Date Of First Commit | Number Of Commits
--- | --- | ---
Simon Sarasova | June 13, 2023 | 267
Simon Sarasova | June 13, 2023 | 285

@@ -34,8 +34,6 @@ Access Seekia's clearnet website at [Seekia.net](https://seekia.net).
Access Seekia's Ethereum IPFS ENS website at [Seekia.eth](ipns://seekia.eth). This site can be accessed through Brave Browser.
Access Seekia's Tor website at [seekia77v2rqfp4i4flavj425txtqjpn2yldadngdr45fjitr72fakid.onion](http://seekia77v2rqfp4i4flavj425txtqjpn2yldadngdr45fjitr72fakid.onion).
Read the whitepaper at `/documentation/Whitepaper.pdf`
Read the documentation at `/documentation/Documentation.md`

@@ -71,7 +71,7 @@ Each network type is described by a single byte. Mainnet == 1, Testnet1 == 2.
Multiple networks allow for the testing of new features on a test network before deploying them on the main network.
Each network has its own account credit database, account credit interface servers, network entry seeds, and parameters. Profiles, messages, reviews, reports, and parameters all contain a network type byte.
Each network has its own payment proof providers, network entry seeds, and parameters. Profiles, messages, reviews, reports, and parameters all contain a network type byte.
Users can switch their app's network type. Upon switching network types, the Seekia client will interface with the new network and delete downloaded database content from different networks. User data such as messages and chat keys are retained, so users can switch between networks without losing sensitive data.
@@ -107,7 +107,7 @@ If hosts are hosting Host/Moderator identities, they must host all of them and t
### Host Identity Balance
Each host identity must be funded with a minimum amount of credit to participate in the network.
Each host identity must be funded with a minimum amount of cryptocurrency to participate in the network.
The cost in gold per day is defined by the network parameters.
@@ -257,112 +257,49 @@ All Seekia messages, mate profiles, and reports must be funded to be hosted by t
Identities must be funded to have their profiles hosted by the network.
This is required to prevent network spam and discourage bad behavior.
Funding is required to prevent network spam and discourage bad behavior.
Without a financial cost to broadcasting content, a single actor could spam the network with billions of fake profiles/messages, rendering the network useless. By requiring funds, broadcasting spam costs an attacker money.
Seekia users can be banned if they engage in malicious behavior, so being a malicious user will cost money.
Users spend enough credit to have their profile hosted for as long as they initially desire. For example, if a Mate user wants to try out Seekia for 60 days, they fund their Mate identity for 60 days. They can extend their identity's balance any time they want.
Users spend enough cryptocurrency to have their profile hosted for as long as they initially desire. For example, if a Mate user wants to try out Seekia for 60 days, they fund their Mate identity for 60 days. They can extend their identity's balance any time they want.
Users must fund their mate/host identity for a minimum number of days. This only needs to be done once per each identity. The account credit servers will not allow a funding below a minimum number of days, if the mate/host identity has not already been funded in the past. Moderator identities are funded via Moderator Scores, which are described later in this document. Anyone can fund another user's identity, which is useful if that user's identity is close to expiring.
Users must initially fund their mate/host identity for a minimum number of days. This only needs to be done once per each identity. This is required to prevent an attacker from funding many identities for a very small amount of time (1 hour), and flooding the network with profiles. Moderator identities are funded via Moderator Scores, which are described later in this document. Anyone can fund another user's identity, which is useful if that user's identity is close to expiring.
Each mate profile must be funded individually for a flat fee. Without this, an attacker could replace their identity's mate profile thousands of times, which would spam the moderators with profiles to review. Host and moderator profiles do not have this issue, because these profiles do not need to be approved by the moderators. Host and Moderator profiles can be banned or approved, but they do not need to be approved before being downloaded or viewed by users.
Reports and messages must each be funded individually. Reports use a flat fee, whereas messages are funded based on their size. Larger messages are more expensive.
Reports and messages must each be funded individually. Reports use a flat fee, whereas messages are funded based on their size and network duration. Larger messages are more expensive.
The costs to fund identities/profiles/messages/reports are defined in the network parameters. All of the parameter costs must be updated in a way that allows a time period for all clients to update their parameters. Otherwise, some user clients will overpay/underpay because they have outdated costs.
If the spam on Seekia started to increase, the network admins would increase the costs. A perfect balance must be achieved which reduces the amount of spam and unruliness but keeps the cost low for users to participate.
To determine the funded status of an identity/profile/message/report, hosts and users request the information from the account credit servers.
To determine the funded status of an identity/profile/message/report, hosts and users request the information from hosts who provide blockchain information.
## Account Credit
## Funding Content
Seekia is not a fully decentralized network.
Seekia uses cryptocurrency to fund identities, mate profiles, reports, and messages on the network.
Seekia uses Credit rather than cryptocurrency to fund host/mate identities, reports, and messages on the network. Moderator identity scores do not use Credit, and instead use cryptocurrency.
Non-moderator identities, reports, mate profiles, and messages are all funded using payment proofs.
Account credit is used instead of cryptocurrency for 2 reasons: Privacy and Scalability.
*TODO: Explain merkle tree payment proofs. An explanation currently exists in Future-Plans.md.*
In a fully decentralized model, the funding of messages, reports, mate profiles, and identities would be accomplished with private blockchain transactions. An example of this is a zero knowledge accumulator, where each transaction is unlinkable.
The price of funding content/identities on the network is represented in gold. To calculate the amount of funds sent to a particular address, Seekia multiplies the amount of crypto sent in each payment by the parameter-defined gold exchange rate at the time of the transaction, then sums the results to determine the total amount of gold sent to the address.
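As an illustrative sketch of this gold conversion (the `payment` type and function name below are hypothetical, not from the Seekia codebase), each payment's crypto amount is multiplied by its historical exchange rate, and the results are summed:

```go
package main

import "fmt"

// payment records one crypto deposit and the parameter-defined
// crypto-to-gold exchange rate at the time of the transaction.
type payment struct {
	cryptoAmount float64 // amount of crypto sent
	goldRate     float64 // milligrams of gold per crypto unit at transaction time
}

// totalGoldMilligrams sums each payment's crypto amount multiplied by
// the exchange rate that was in effect when that payment was made.
func totalGoldMilligrams(payments []payment) float64 {
	total := 0.0
	for _, p := range payments {
		total += p.cryptoAmount * p.goldRate
	}
	return total
}

func main() {
	payments := []payment{
		{cryptoAmount: 0.5, goldRate: 100}, // 50 mg of gold
		{cryptoAmount: 0.2, goldRate: 120}, // 24 mg of gold
	}
	fmt.Println(totalGoldMilligrams(payments)) // 74
}
```

Because each payment is converted at the rate for its own transaction time, the total is stable even when exchange rates change between deposits.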
Supporting 10,000 messages per second would require a blockchain that can support 10,000 private transactions per second, along with a built-in wallet within the application.
Moderator identity scores are not funded with payment proofs. Moderators use crypto addresses derived from their identity hash. This makes it easier to calculate a moderator's identity score, as downloading payment proofs is not necessary. It also reduces the amount of data that the Seekia network needs to maintain in perpetuity. Moderators should use blockchain privacy tools to fund their identity scores to avoid linking their crypto wallets with their moderator identity.
Due to the scaling limitations of privacy-preserving blockchains, the network relies on a central account credit database to perform accounting privately.
All Seekia clients get their hosted message/profile/report funded statuses from blockchain hosts. All communication between clients and payment proof servers must be encrypted with Nacl and Kyber.
Credit is represented as milligrams (change?) of gold within the account credit database.
Using cryptocurrency for funding content also allows for the timestamping of profiles/reports/messages. The blockchain becomes a source of truth for the earliest time at which a message can be proven to exist. If the sender-alleged message creation time conflicts with the payment proof funding time by more than an hour, a warning could be shown.
Credit can be purchased with cryptocurrency by destroying funds on the blockchain. The account credit interface servers can check an account's quantity of purchased credit by checking the balance of its associated blockchain address.
After a mate-profile/message/report is funded, its funded status is static, and its expiration time cannot be increased.
An advantage of using Credit is that it enables some users to join for free. The administrator of the account credit database can create credit at will. The administrator can send credit to trusted entities whose job is to distribute credit to people to onboard them to Seekia for free. For example, credit could be distributed by a faucet that requires a unique phone number, because phone numbers are costly for spammers to obtain. Other examples include sending credit manually to users who have proof of personhood, sending credit to people who have verified social media accounts, etc.
### Privacy Risks
Credit can be transferred between users. A user shares their Account Identifier with another user, who can send credit to that identifier.
Payment proof providers pose a necessary privacy risk. The servers must be trusted to not keep track or log which account funded each profile/identity/message/report. If the proof provider servers were compromised over a period of time, they could be used to log the profiles/identities/messages/reports funded by each account. This would negate the privacy advantages of secret inboxes, making it easier to tell which users are talking to each other.
### Account Credit Servers
There is a single central account credit database, along with many account credit interface servers.
The account credit interface servers are used to load-balance all of the operations and bandwidth that do not need to be centrally performed. They also provide protection against hacks, because some account credit servers are read-only.
There are 2 types of interface servers: Read Only and Writeable. The read only servers are only able to read from the database, which suffices for most requests. Limiting write access reduces the number of servers that, if compromised, would be able to corrupt the master database with false information.
The central database keeps track of each account's credit balance and each funded identity/profile/message/report's expiration time.
The database server is a single point of failure. It can be regularly backed up.
Each account is an ed25519 public/private key pair.
Each account public key is used to derive cryptocurrency addresses and an account identifier. See `/internal/network/accountKeys.go` for the implementation.
Each user can create as many accounts as they need. Cryptocurrency address and account identifier reuse is discouraged because of the privacy implications.
An account's public key is used to query the servers on the balance of the account. Without the private key, a requestor cannot determine the account credit balance. They must perform a handshake and sign something provided by the account server to verify they are the owner of the account.
Users can send funds from one account to another by using the account's identifier.
The interface servers communicate with the account credit database, deducting from the account's balance for each profile/message/report/identity funding transaction they make.
To buy credit using cryptocurrency, the user sends crypto to the address associated with their public key.
Any funds sent are destroyed. This is done for multiple reasons:
1. Technical: It is easiest to create a different address for each account this way.
2. Legal: to avoid any claims that Seekia is generating profit or acting as a for-profit entity.
3. Ideological: to keep Seekia as decentralized as possible. No single entity should profit from the users.
To spend funds, the user contacts an account credit interface server with their public key, their intended amount of credit to spend (in gold), and the message/profile/identity/report to fund. The account credit interface server derives the account public key's crypto addresses and looks up the deposits made to those addresses. The server multiplies the amount of crypto sent in each deposit by the parameter-defined gold exchange rate at the time of the deposit to determine the total amount of credit in gold purchased for the account. The server then tells the database the crypto balance, the amount being spent, and the item being funded.
An account's balance is the total amount of credit received via its identifier minus the total amount of credit spent. A balance can be negative if the user has only received credit to the account by purchasing via crypto.
The database checks whether the total amount deducted from the account would exceed the amount of credit purchased with crypto. If it would, the transaction is rejected. If not, the amount being spent is subtracted from the account entry in the database, and the message/profile/identity/report funded status is updated within the database.
Each interface server must have access to:
1. The blockchain address deposits of each cryptocurrency
* They must be able to get the balance of any address, as well as the time and amount of each transaction
* All deposits in each block are combined into a single deposit for the block
* The servers can retrieve these deposits from different servers
2. The network parameters that determine the amount of gold to fund a message/profile/report/identity
3. The network parameters that determine the exchange rate for each cryptocurrency to gold.
* These rates must be historical and go back to the date of Seekia's launch.
Moderator identity scores do not rely on the account server. Moderators use crypto addresses derived from their identity hash. This prevents their balances from being lost if the server data is lost, which is much worse for moderators because the amount of money spent is much greater. Moderators should use blockchain privacy tools to fund their identity scores to avoid linking their crypto wallets with their moderator identity.
All communication between the database, the interface servers, hosts, and clients must be encrypted with Nacl and Kyber.
Another use of the servers could be timestamping of messages. The servers could be a source of truth for when a message was sent. If the sender-alleged message creation time conflicts with the account credit server by more than 1 minute, a warning could be shown. Otherwise, message creation times could be relied upon.
Hosts get their hosted message/profile/report funded statuses from the account credit servers. Mate/Moderator users get the message funded statuses from the credit servers, and the profile funded statuses from hosts. This is done to reduce the load on the account credit interface servers, and to enable the network to maintain more functionality if the account credit servers go offline.
After a profile/message/report is funded, its funded status is static, and its expiration time cannot be increased.
### Privacy Risk
The servers pose a necessary privacy risk. The servers must be trusted to not keep track or log which account funded each profile/identity/message/report. If the servers were compromised over a long period of time, they could be used to log the profiles/identities/messages/reports funded by each account. This would negate the privacy advantages of secret inboxes, making it trivial to tell which users are talking to each other.
If an attacker only obtained a snapshot of the servers, they would only learn the balances of each account. If the attacker could link the accounts to the Seekia users to whom they belong, and a user received all of their credit by purchasing with cryptocurrency, the attacker could tell how much credit the user has spent, and guess roughly how many messages the user sent. This is not possible if the user received credit for free or from another user, in which case their credit balance would be subtracted from the database as it is spent. The amount of credit which belonged to an account in the past should not be saved or logged by the database; thus, the spent credit would disappear without a trace.
An attacker could potentially determine which exact messages were sent by a user. If the attacker linked a user's identity to their credit account(s) and balance(s), they could subtract the user's known identity/profile funding transaction amounts and determine which sent message/report costs add up exactly to the amount of funds spent. They could use information about the recipients of the messages to better guess that they had been sent by the suspected user. This becomes more difficult as more users join Seekia. Image messages should often cost the same amount, so this strategy should become impossible with enough users.
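The linking attack described above is essentially a subset-sum search over known costs. A minimal brute-force sketch (hypothetical names, not part of Seekia) shows why uniform message costs defeat it: the more subsets that sum to the same unexplained amount, the less the attacker learns.

```go
package main

import "fmt"

// findMatchingSpends brute-forces which subsets of candidate
// message/report costs add up exactly to the unexplained amount of
// spent funds. This sketches the attack itself, not any Seekia code.
func findMatchingSpends(costs []int64, target int64) [][]int64 {
	matches := [][]int64{}
	for mask := 1; mask < 1<<len(costs); mask++ {
		var sum int64
		subset := []int64{}
		for i, cost := range costs {
			if mask&(1<<i) != 0 {
				sum += cost
				subset = append(subset, cost)
			}
		}
		if sum == target {
			matches = append(matches, subset)
		}
	}
	return matches
}

func main() {
	// An unexplained spend of 70 matches exactly one subset here (20 + 50),
	// so the attacker learns which items were funded. If all costs were
	// identical, many subsets would match and the attack would fail.
	fmt.Println(findMatchingSpends([]int64{20, 50, 90}, 70)) // [[20 50]]
}
```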
If an attacker only obtained a snapshot of the servers, they would only learn the balances of each account.
In any server-compromise scenario, the message contents would still be encrypted.
@@ -370,80 +307,17 @@ In any server-compromise scenario, the message contents would still be encrypted
Another privacy consideration is the ability to link a user's identity hash to their account crypto address(es).
Account addresses will never withdraw funds, and will likely receive funds in the small amounts recommended by the Seekia client, making them easier to identify.
If a user funds their moderator score and payment proof provider account with the same Ethereum/Cardano wallet, then linking these addresses together is trivial.
If a user funds their moderator score and credit account with the same Ethereum/Cardano wallet, then linking these addresses together is trivial.
Another easy way to link identities to cryptocurrency addresses is to correlate the funding of payment proof provider addresses on the blockchain with the funding of new user identities/profiles on the Seekia network. This issue is mitigated by telling users in the GUI to wait a while after purchasing custodied cryptocurrency before broadcasting their profile for the first time. This breaks the link between their custodied cryptocurrency purchase and their identity/profile being funded.
Another easy way to link identities to addresses is to correlate the funding of account addresses on the blockchain with the appearance of new users profiles on the Seekia network. This issue is mitigated by telling users in the GUI to wait a while after purchasing credit before broadcasting their profile for the first time. This breaks the link between the identity being funded and their account crypto address.
Even if users are careful to prevent any links between their payment proof provider cryptocurrency transactions and their Seekia identity, observers may still be able to guess that the funds belong to some user of Seekia, because the addresses owned by payment proof providers will be easy to discover. Using blockchain analytics and user profile metadata, they could learn the wallet owner's real world identity.
Even if they are careful to prevent any links between their account crypto address and their Seekia identity, observers will still be able to guess that the funds belong to some user of Seekia. Using blockchain analytics and user profile metadata, they could learn the wallet owner's real world identity.
An attacker could potentially determine which exact messages were sent by a user. If the attacker observed the amount of cryptocurrency that a particular cryptocurrency address sent to a payment proof provider, they could subtract each user's known identity/profile funding transaction amounts and determine which sent message/report costs add up exactly to the amount of funds spent. They could use information about the recipients of the messages to better guess that they had been sent by the suspected user. This would allow attackers to link Seekia users to their cryptocurrency addresses and messages.
If user identities are linked to account crypto addresses, users who send from crypto wallets with large amounts of money could have their crypto wallet balances revealed to the world. This could cause them to become the victim of crime or be pursued by gold diggers. Users with large amounts of crypto should use privacy preserving technologies such as zero knowledge accumulators when purchasing Seekia credit. This warning is shown within the GUI.
If user identities are linked to account crypto addresses, users who send from crypto wallets with large amounts of money could have their crypto wallet balances revealed to the world. This could cause them to become victims of crime or be pursued by gold diggers. Users with large amounts of crypto should use privacy preserving technologies such as zero knowledge accumulators when increasing moderator identity scores or purchasing custodied cryptocurrency for payment proofs. This warning is shown within the GUI.
If a user funds the same account crypto address more than once, an observer can assume that the user has funded enough identities/profiles/messages/reports to drain at least the majority of their credit balance after their first deposit. This would allow the observer to guess that a specific user had sent a certain number of messages, which could be used to aid in other network analysis attacks. This issue is mitigated by discouraging address reuse and presenting the user with fresh crypto addresses whenever they want to purchase more credit.
As more people use Seekia and the number of Seekia transactions increase, these privacy risks are reduced.
### Account Credit Database Corruption
In the event of the database crashing or being hacked, the data could be corrupted.
The database server should be backed up regularly, but some data will likely be lost.
If the database is reset to an earlier state:
1. Any accounts funded with cryptocurrency will have their balances increase or stay the same.
2. Accounts funded via account identifiers will lose any money that was sent after the backup was made
* The account which sent the funds will have its balance restored
#### Account Credit Database Schema:
* Account Identifier `[14]byte` -> Balance (in milligrams of gold)
* This amount can be negative, if the account has purchased funds with cryptocurrency
* Sending from 1 account to another requires subtracting from the sender and adding to the recipient
* Message Hash `[26]byte` + MessageSize `int` -> ExpirationTime `int64` (unix)
* We need the message size because a requestor could lie about it, so each alleged size corresponds to its own entry
* Thus, the size of the message is required to get the isFunded status of a message
* The alternative requires uploading a message to the interface servers to fund it, which increases bandwidth dramatically
* Mate Profile Hash `[28]byte` -> ExpirationTime `int64` (unix)
* Identity Hash `[16]byte` -> ExpirationTime `int64` (unix)
* Identity Hash `[16]byte` -> Initial fund amount has been made `bool`
* This is needed to keep track of which identities have had their initial minimum fund amount satisfied
* Report Hash `[30]byte` -> ExpirationTime `int64` (unix)
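As a hedged sketch, the schema above could be modeled in Go roughly as follows (the type and field names are invented for illustration and do not come from the Seekia codebase):

```go
package main

import "fmt"

// messageKey combines the message hash with its alleged size, because
// each alleged size corresponds to its own funded-status entry.
type messageKey struct {
	hash [26]byte
	size int
}

// accountCreditDatabase sketches the central database schema.
// Balances are milligrams of gold and may be negative for accounts
// that purchased all of their credit with cryptocurrency.
type accountCreditDatabase struct {
	accountBalances            map[[14]byte]int64   // account identifier -> balance (mg of gold)
	messageExpirations         map[messageKey]int64 // message hash + size -> unix expiration time
	mateProfileExpirations     map[[28]byte]int64   // mate profile hash -> unix expiration time
	identityExpirations        map[[16]byte]int64   // identity hash -> unix expiration time
	identityInitialFundingDone map[[16]byte]bool    // identity hash -> initial minimum fund satisfied
	reportExpirations          map[[30]byte]int64   // report hash -> unix expiration time
}

func main() {
	db := accountCreditDatabase{accountBalances: map[[14]byte]int64{}}

	var account [14]byte
	db.accountBalances[account] -= 500 // spending drives a crypto-funded account negative
	fmt.Println(db.accountBalances[account]) // -500
}
```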
#### Interface Servers Schema:
* MessageHash `[26]byte` + MessageSize `int` -> ExpirationTime `int64`
* Server only has to retrieve this once after it is funded, because time cannot be increased
* Mate Profile Hash `[28]byte` -> ExpirationTime `int64`
* Server only has to retrieve this once after it is funded, because time cannot be increased
* Report Hash `[30]byte` -> ExpirationTime `int64`
* Server only has to retrieve this once after it is funded, because time cannot be increased
* Identity Hash `[16]byte` -> ExpirationTime `int64`
* This is only needed for mate/host identities
* It must be updated with a background job, because a user may increase their identity expiration time using a different interface server
### A future without the servers
Once private cryptocurrency solutions can scale to our needed speed, the Seekia client can have its own crypto wallet that pays for each message/mate profile/report with a private transaction, and the account credit servers can be retired. Each message/mate profile/report/identity hash would have crypto addresses that are derived from its hash. These addresses would be used to burn coins, similarly to how moderator scores are funded.
It is possible that a more centralized high throughput blockchain could exist sooner that could support the necessary number of private transactions. It would be worth using this kind of system instead of the single-database option because it would be more decentralized.
Assuming each private transaction is 3KB, and there were 10,000 Seekia transactions per second, 30 MB would be added every second, or ~2.6 terabytes a day. At least some blockchain nodes would also have to verify the zero knowledge proofs, which would be resource intensive.
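The arithmetic can be checked directly (a standalone estimate of the rounded figure above; the function name is illustrative):

```go
package main

import "fmt"

// dailyTerabytes estimates daily blockchain growth for a given private
// transaction size (in kilobytes) and throughput (transactions/second),
// using decimal units: 1 MB = 1000 KB, 1 TB = 1e6 MB.
func dailyTerabytes(transactionSizeKB, transactionsPerSecond float64) float64 {
	megabytesPerSecond := transactionSizeKB * transactionsPerSecond / 1000
	return megabytesPerSecond * 86400 / 1e6
}

func main() {
	// 3 KB per transaction at 10,000 tx/s is 30 MB/s, about 2.59 TB/day.
	fmt.Printf("%.2f TB/day\n", dailyTerabytes(3, 10000))
}
```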
The blockchain could have a smart contract such as Tornado Cash Nova or Zcash Orchard that allows users to withdraw arbitrary amounts to addresses privately. Each transaction from the contract would be unlinkable.
In order for users to be able to create transactions, they would have to download the necessary information required to construct a zero knowledge proof that their coins came from some coin in a shielded pool. This would eventually become an enormous amount of data. There are several ways to reduce the burden of data to download:
1. Use many shielded pools. This would reduce the anonymity set, but it would be large enough for that to not matter. The Seekia application should choose one randomly for each user, so the user would only have to download changes to a single note tree.
2. Use a single shielded pool, but allow the user to only download a random portion of the shielded pool note tree, and construct a proof from this smaller anonymity set. I'm not sure if this is possible.
3. For a faster but more trusted method, there could be a way for the user to trust the blockchain provider to construct their transaction proof, without allowing the blockchain provider to steal their funds. I'm also not sure if this is possible. This would negate the need to download large amounts of data, but would require the blockchain provider to be trusted to not track which coins the user is spending, as this would reveal the messaging patterns of the user. This level of trust is already required for the account credit interface servers, which are operated by trusted entities.
To get funded statuses for identities/messages/profiles, hosts would connect to nodes which were hosting the balances of all transparent addresses, get the deposit information for the addresses that belong to the identities/messages/profiles, and use this deposit information and the network parameters to calculate the funded statuses.
If there were multiple cryptocurrencies, then multiple blockchains wallets would have to be supported within the app.
To maintain the advantage of onboarding people for free, a token would have to be created, which would require an admin to be able to mint tokens at will. This would create a marketplace for speculation, would be less decentralized, would make coins more difficult to purchase, and would introduce legal risks. I think it would be better to require all coins to be burned in the blockchain's native token. This could also be harmful by encouraging people to purchase a cryptocurrency which is centralized, so a warning must exist to discourage people from investing in the currency.
As more people use Seekia and the quantity of Seekia transactions increases, the anonymity set for each payment proof provider increases, and these privacy risks are reduced.
### Multiple Cryptocurrencies
@ -474,7 +348,7 @@ Outputs would have to be publicly burned, which would create many useless decoys
Using Monero in this way would also reduce the privacy of Seekia users. Each burned output's input decoys could more easily be traced to a user's real world identity, aided by the user's profile metadata.
Linking two consecutively burned outputs together would also be quite easy due to the limited number of decoys. An example would be if a user funds their credit account after funding their moderator identity. This is obviously an even greater problem on transparent blockchains like Ethereum, but Monero has an expectation of being private which we do not want to degrade.
Linking two consecutively burned outputs together would also be quite easy due to the limited number of decoys. An example would be if a user funds their payment proof provider account after funding their moderator identity. This is obviously an even greater problem on transparent blockchains like Ethereum, but Monero has an expectation of being private which we do not want to degrade.
The blockchain servers would also have to parse all the outputs with a public view key, which would be slower.
@ -564,11 +438,11 @@ Inter-IdentityType communication is forbidden. Mate users can only contact Mate
### Funding Messages
Each message must be funded with credit. The cost depends on the size and duration of the message.
Each message must be funded with cryptocurrency. The cost depends on the size and duration of the message.
Each message only has 2 options for duration: 2 days and 2 weeks. This makes all messages look more similar, reducing the possibility of linking a message fund duration to a particular sender.
Once a message is funded, its duration cannot be extended. This allows hosts and users to not have to make any more queries to the credit servers after they have confirmed that a message has been funded and retrieved its expiration time.
Once a message is funded, its duration cannot be extended. This allows hosts and users to not have to download any more of a message's payment proofs or make any more queries to blockchain hosts after they have confirmed that a message has been funded and retrieved its expiration time.
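The funding rule above can be sketched as a pricing function: cost scales with message size and duration, and only the two permitted durations are accepted. The rate constant and function name are assumptions for illustration, not real network parameters:

```go
package main

import (
	"errors"
	"fmt"
)

// messageFundingCost sketches the pricing rule described above: cost scales
// with message size and duration, and only 2-day and 2-week (14-day)
// durations are allowed.
func messageFundingCost(sizeBytes int, durationDays int) (float64, error) {
	if durationDays != 2 && durationDays != 14 {
		return 0, errors.New("message duration must be 2 days or 2 weeks")
	}
	const costPerByteDay = 0.0001 // assumed rate, not a real parameter
	return costPerByteDay * float64(sizeBytes) * float64(durationDays), nil
}

func main() {
	cost, err := messageFundingCost(500, 14)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%.2f\n", cost) // 0.70
}
```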
### Message Encryption Keys


@ -11,9 +11,29 @@ There are features and changes to be made before Seekia is ready for launch.
Many tasks are not included here, but are instead annotated within the code with the **TODO** keyword.
### Account Credit Database and Servers
### Replace Account Credit with Merkle Tree Payment Proofs
See `Documentation.md` for the description of how this system should work.
Rather than using a central account credit server to fund content on the network, Seekia should use merkle tree payment proofs.
A merkle tree payment proof provides a way to burn cryptocurrency funds in a single on-chain transaction for any quantity of identities, messages, reports, and profiles. Payment proof merkle trees enable scalable proof-of-sacrifice/proof-of-burn.
Merkle tree payment proofs are needed to be able to support tens of thousands of payments per second, because creating an address and transaction to fund each piece of content and each identity would overload the blockchain(s) with too much on-chain data.
Payment proofs will be created and funded by Payment Proof Providers. These providers bundle payments from users into merkle trees.
Payment proof providers can be operated by anyone. They must be trusted not to log the payments made by each user. The Seekia admin(s) will share a list of trusted payment proof providers. Users will purchase cryptocurrency from them, which the providers will keep custody of. Payment proof providers can allow users to purchase custodied cryptocurrency via any payment method they desire, including bank transfers, credit cards, and cryptocurrency.
Payment proof providers may be required to register as exchanges/money transmitters, depending on their jurisdiction. Know Your Customer/Anti-Money Laundering laws may also apply in certain jurisdictions. A payment proof provider that does not allow any cryptocurrency to be withdrawn or traded may not be subject to these laws, as it is only allowing the user to purchase virtual custodied cryptocurrency. Similarly, online merchants which sell things for cryptocurrency are generally not required to collect customer identification information, especially for small payment values. Payment proof providers can also make a profit from users by charging a small fee.
Payment proof providers will operate very similarly to [OpenTimestamps.org](https://www.opentimestamps.org) calendars. The major difference between them is that when using payment proof providers, users must pay to have their content included in the merkle tree. This requires payment proof providers to keep track of each user's balance, and provide a way for users to purchase custodied cryptocurrency.
If any of the payment proof providers are suddenly shut down, the payment proofs they created will still be valid. The users who purchased funds from them will lose any funds they had not already spent. The user's client will be able to switch to a new provider, and the user's balance will reset to 0.
A payment proof is a merkle tree path. The on-chain address for the payment proof is a hash of (the root of the merkle tree + a byte representing the cryptocurrency). The cryptocurrency byte is included so that the address is entirely different on each blockchain. The cryptocurrency value of each payment proof is calculated by halving the amount of cryptocurrency initially sent to the merkle tree root's address once for each layer between the root and the content's leaf node. Each leaf node is a hash of an identity hash, a profile hash, a message hash, or a report hash. A single merkle tree can contain multiple identical leaf nodes, allowing each merkle tree to provide multiple payment proofs for the same piece of content/identity hash. A payment proof merkle tree can also be unbalanced, meaning that some leaves are closer to the root and thus spend a larger portion of the total amount. For example, a 1 ETH merkle tree could distribute 0.5 ETH to a single first-layer leaf, 0.25 ETH to a second-layer leaf, and 0.125 ETH to each of 2 third-layer leaves.
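The value rule can be sketched as follows, reproducing the 1 ETH example: a leaf at depth `d` is worth the total burned amount divided by 2^d. The function is illustrative, not Seekia's implementation:

```go
package main

import "fmt"

// paymentProofValue computes a payment proof's value: the amount sent to the
// merkle root's on-chain address is halved once for each layer between the
// root and the leaf, so a leaf at depth d is worth totalAmount / 2^d.
func paymentProofValue(totalAmount float64, leafDepth int) float64 {
	value := totalAmount
	for i := 0; i < leafDepth; i++ {
		value /= 2
	}
	return value
}

func main() {
	// The 1 ETH example: one first-layer leaf, one second-layer leaf,
	// and two third-layer leaves.
	fmt.Println(paymentProofValue(1.0, 1)) // 0.5
	fmt.Println(paymentProofValue(1.0, 2)) // 0.25
	fmt.Println(paymentProofValue(1.0, 3)) // 0.125
}
```

Note that an unbalanced tree distributes the whole amount only if the leaf depths satisfy the usual Kraft-style accounting (0.5 + 0.25 + 0.125 + 0.125 = 1 ETH in the example above).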
The Seekia client will put scheduled user payments in a queue. The payment proof providers will wait for a certain time period to collect as many payments as possible, and then will combine them into a merkle tree and distribute the payment proofs to the users. The payment proofs are then broadcasted to the Seekia network by the users, along with the content that the proofs paid for. A single piece of content can be paid for with any quantity of payment proofs. There must be a minimum amount of cryptocurrency per payment proof to prevent spam.
Payment proofs allow the Seekia network to be fully decentralized, reducing the legal and technological attack surface of the Seekia network. Of course, nothing is "fully" decentralized, but at least there is no single server that the network relies upon.
### Importing Profiles
@ -269,9 +289,9 @@ There could be several analysis methods. These analysis methods will serve as an
Providing an open source ancestral analysis method is essential for race aware mate discovery technology to be credibly neutral. There already exist multiple open source ancestral analysis packages.
### Add Custom Type Illnesses
### Add Complex Disease Diagnosis
Many genetic illnesses are not able to be detected using the methods implemented in the `monogenicDiseases` or the `polygenicDiseases` packages.
Many genetic diseases are not able to be detected using the methods implemented in the `monogenicDiseases` or the `polygenicDiseases` packages.
Examples include diseases such as Fragile X and Turner's Syndrome.
@ -279,11 +299,7 @@ A new format called `complexDiseases` could be created.
Each disease can have a function that takes in a genome map and returns a diagnosis.
Many of these diseases may require additional data from the raw genome files that is not included in the genome map.
The `ReadRawGenomeFile` function should be able to read this relevant data.
The GUI would also have an accompanying set of pages to display these Custom illnesses.
The GUI would also have an accompanying set of pages to display complex diseases.
### Add Polygenic Disease Probability Risk
@ -293,23 +309,17 @@ Meaning, we want to tell the user the estimated probability that they will get a
Example: Normal risk = 5%, Your risk = 10%
We should be able to calculate this risk. We know the polygenic disease odds ratio of a base pair `(odds of disease with base pair)/(odds of disease with standard (common) base pair)`. We know the average probability of disease for the general population for each age period. We know the probability of each base pair for the general population.
This will be the most useful statistic for users trying to understand their polygenic disease risk.
Knowing that the probability of a particular type of cancer has increased by 10x is very different depending on the probability of getting the cancer.
Knowing that your risk score for a particular type of cancer is 10/10 is much less useful than understanding your probability of getting the cancer.
If the general population probability of getting cancer X is 5%, and the user's adjusted risk is 50%, that is a significant increase. However, if the general population risk is 0.1%, and the user's adjusted risk is 1%, then the user does not need to change their behavior or worry much.
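The adjustment described above can be sketched with standard odds-ratio algebra: convert the baseline probability to odds, scale by the combined genetic odds ratio, and convert back. This is a simplified illustration, not the exact model Seekia will use:

```go
package main

import "fmt"

// adjustedDiseaseProbability converts a baseline population probability to
// odds, scales it by the user's combined genetic odds ratio, and converts
// back to a probability. Simplified sketch: a real risk model must also
// account for variant population frequencies and interactions.
func adjustedDiseaseProbability(baselineProbability float64, oddsRatio float64) float64 {
	baselineOdds := baselineProbability / (1 - baselineProbability)
	adjustedOdds := baselineOdds * oddsRatio
	return adjustedOdds / (1 + adjustedOdds)
}

func main() {
	// A rare cancer (0.1% baseline) with a 10x odds ratio is still only ~1% risk.
	fmt.Printf("%.4f\n", adjustedDiseaseProbability(0.001, 10))
	// A common disease (5% baseline) with the same 10x odds ratio is ~34% risk.
	fmt.Printf("%.4f\n", adjustedDiseaseProbability(0.05, 10))
}
```

This illustrates the point above: the same 10x increase in odds means very different things at a 0.1% baseline than at a 5% baseline.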
### Add Neural Network Genetic Predictions
### Get Genetic Training Data
The current method for predicting polygenic disease risks and traits is not as informative and accurate as using neural nets.
We use neural networks to predict traits and polygenic diseases. We have to train these networks using example training data. This training data is a collection of people's genomes and the trait/polygenic disease information for each person.
Our current model adds and subtracts the likelihood values of various SNPs that are reported to have an effect on polygenic diseases and traits.
A much better method is to train a neural net to predict traits and polygenic diseases from a large number of genes. Methods exist to find the set of genes that affect each trait/disease. For example, height is said to be affected by ~10,000 SNPs. Many genome wide association studies exist which report which genes are responsible for certain traits and diseases. These are the genes to feed into the neural net for each trait/disease. These are also the genes that users will share in their profiles. I have already started building this system. See `geneticPrediction.go` for an implementation of trait prediction using neural networks, and `createCoupleGeneticAnalysis.go` for information on how offspring predictions would work.
This method requires training data, which is largely unavailable for public use. We need fully open training data, not data that requires registration or permission to download.
Good training data is largely unavailable for public use. We need fully open training data, not data that requires registration or permission to download.
[OpenSNP.org](https://opensnp.org) is a free genomic data repository. OpenSNP relies on user submitted data, which can be falsified. OpenSNP should add a verification system so data provided by trustworthy people can be prioritized.
@ -319,10 +329,10 @@ Whoever collects the data needs to choose what data to collect from each person.
Some examples of data to collect:
* Collecting polygenic disease information would enable prediction of polygenic disease risk.
* Collecting polygenic disease information enables prediction of polygenic disease risk.
* Pictures and scans of participants' faces would enable a genetic test for facial structure
* Personality tests would enable prediction of personality
* Measuring height would enable prediction of height
* Measuring height enables prediction of height
These kinds of genetic tests would allow parents to choose what their offspring will look like, their personality, and their intelligence.
@ -336,10 +346,12 @@ All of this is already possible, but will become easier with the proliferation o
### Add more diseases and traits
This task entails entering disease/trait SNP data from SNPedia.com and other sources. The bases have to be flipped if the orientation on SNPedia is minus. This requires flipping G/C and A/T. At least 3 people should check any added disease SNPs to ensure accuracy.
Adding monogenic diseases entails entering disease SNP data from SNPedia.com and other sources. The bases have to be flipped if the orientation on SNPedia is minus. This requires flipping G/C and A/T. At least 3 people should check any added disease SNPs to ensure accuracy.
This is a tedious data entry process with negative consequences if mistakes are made. Many users could falsely believe they have monogenic diseases, which could trigger mental health crises.
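The base flipping described above is nucleotide complementation. A minimal helper (the function names are illustrative, not Seekia's actual API):

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// flipBase complements a single nucleotide, converting minus-orientation
// SNP data to plus orientation: A<->T and G<->C.
func flipBase(base rune) (rune, error) {
	switch base {
	case 'A':
		return 'T', nil
	case 'T':
		return 'A', nil
	case 'G':
		return 'C', nil
	case 'C':
		return 'G', nil
	}
	return 0, errors.New("invalid base: " + string(base))
}

// flipAllele flips every base in an allele string, e.g. "AG" -> "TC".
func flipAllele(allele string) (string, error) {
	var builder strings.Builder
	for _, base := range allele {
		flipped, err := flipBase(base)
		if err != nil {
			return "", err
		}
		builder.WriteRune(flipped)
	}
	return builder.String(), nil
}

func main() {
	flipped, _ := flipAllele("AG")
	fmt.Println(flipped) // TC
}
```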
Adding polygenic diseases/traits requires training data and access to genome wide association studies. Seekia should also have the ability to perform genome wide association studies to find causal genes for traits.
### Interactive Map
Seekia should have an interactive world map. It would be similar to OpenStreetMaps, but with much less detail. It would only need to contain borders of countries and states as lines. It would be able to display latitude/longitude coordinates on the map as points.


@ -107,10 +107,6 @@ The Value String section describes the map entry values encoded as String, which
* Value Bytes: Unicode bytes
* Value String: Unicode string
* Example: `https://seekia.net`
* **SeekiaTorWebsite**
* Description: .Onion address of the current Seekia website hidden service (shown in the GUI)
* Value Bytes: Unicode bytes
* Value String: Unicode string
* **SeekiaEthWebsite**
* Description: .eth address of the current Seekia website Ethereum Name Service IPFS site (shown in the GUI)
* Changing this is only needed if the seekia.eth site was seized by a nefarious entity.


@ -124,7 +124,7 @@ Seekia is not reliant on proprietary mobile app stores. The Seekia application c
The genetic destiny of the human species should not be controlled by a small number of entities. Centralized mate discovery services can attempt to encourage certain kinds of relationships to form. For example, a nefarious mate discovery service could try to increase the prevalence of genetic disorders by encouraging relationships between people who have a higher probability of producing diseased offspring.
The Seekia network strives to be open and decentralized. The Seekia network aims to be resilient in the event that any host suddenly stops participating or is compromised by bad actors. Seekia is not fully decentralized, because it relies on a central credit database to perform scalable private accounting. Seekia still reaps many benefits from its decentralized architecture.
The Seekia network strives to be open and decentralized. The Seekia network aims to be resilient in the event that any host suddenly stops participating or is compromised by bad actors.
Anyone can participate as a network host, which involves serving profiles and messages to other network peers. It is impossible for a single host to prevent specific profiles and messages from reaching the rest of the network. Users broadcast and download content to and from multiple network hosts.
@ -282,19 +282,23 @@ Without any form of spam prevention, a single malicious actor could spam the See
Seekia requires users to fund their identities before broadcasting content to the network. Users must also fund each message, report, and mate profile.
In a fully decentralized model, users would use a cryptocurrency to fund each identity and piece of content. Cryptocurrency addresses would be derived from identity and content hashes. This approach requires using a decentralized cryptocurrency which can support tens of thousands of privacy-preserving transactions per second. I am not aware of any cryptocurrency which can support the necessary throughput, so a centralized accounting model is used instead.
Users use cryptocurrency to fund each identity and piece of content. A simple way to accomplish this is to derive cryptocurrency addresses from identity and content hashes, and to send funds to these addresses to destroy coins. This strategy would require at least one cryptocurrency transaction to fund each identity and piece of content, which would limit the activity on the Seekia network to the scaling capabilities of the utilized cryptocurrencies.
### Account Credit
### Payment Proofs
A centralized account credit database is used to facilitate the funding of content and identities on the Seekia network.
Payment proofs are used to enable the funding of many different identities and pieces of content in a single blockchain transaction.
Each account has a credit balance. An account is represented by a public/private key pair. Users must possess an account's private key to view and spend its balance. Credit can be purchased with cryptocurrency. Users can send credit from one account to another.
A payment proof is a merkle tree path. A payment proof merkle tree is a bundle of cryptographic hashes. Each leaf node in the tree is a hash of an identity hash or a content hash. The on-chain address for each payment proof is derived from the merkle tree's root. The value of the cryptocurrency sent to each merkle tree's blockchain address is distributed among the tree's leaf nodes.
The database is trusted to not log user behavior. If a snapshot of the database were ever leaked, sensitive information such as the senders of messages would not be revealed.
Payment proofs are created and funded by Payment Proof Providers. These providers bundle payments from users into merkle trees. Users can purchase virtual custodied cryptocurrency from each payment proof provider using cryptocurrency or other payment methods. Users use these funds to purchase payment proofs, which are broadcast to the Seekia network.
Using a central database allows for admins to freely create and distribute credit. Admins are able to onboard users to Seekia for free by sending them credit. A website could be created that allows users to receive credit by verifying ownership of a phone number. Phone numbers are costly for attackers to obtain.
If any payment proof providers are suddenly shut down, the payment proofs they created will still be valid. The users who purchased funds from them will lose any funds they had not already spent. User clients will be able to switch to a new provider, and user balances will reset to 0.
The account credit database is a single point of failure which the network relies upon. Creating backups of the database is prudent. If the database ever goes offline, hosts will continue to serve any content which has already been funded until the content expires from the network.
Payment proofs also provide a privacy advantage. Blockchain transactions can often be traced. Without payment proofs, the addresses where funds originate for each transaction could be traceable, allowing observers to trivially identify which messages were sent by the same identity. Payment proof providers are able to break the link between the purchasing of account funds and the purchasing of payment proofs for their users.
Payment proof providers are trusted to not log user behavior. If a snapshot of a non-logging payment proof provider's database were ever leaked, sensitive information such as the senders of messages would not be revealed.
Payment proofs also function as timestamps. A payment proof proves that the funded identity or content existed at the time of the payment.
## Messaging

go.mod

@ -2,7 +2,7 @@ module seekia
replace seekia => ./
go 1.22
go 1.23
require (
fyne.io/fyne/v2 v2.5.0


@ -2451,7 +2451,7 @@ func setBuildMateProfilePage_Tags(window fyne.Window, previousPage func()){
deleteButton := widget.NewButtonWithIcon("", theme.DeleteIcon(), func(){
newList, deletedAny := helpers.DeleteAllMatchingItemsFromStringList(myTagsList, tagName)
newList, deletedAny := helpers.DeleteAllMatchingItemsFromList(myTagsList, tagName)
if (deletedAny == false){
setErrorEncounteredPage(window, errors.New("Cannot delete tag: tag not found."), currentPage)
return


@ -2445,7 +2445,7 @@ func setBuildMateProfilePage_EyeColor(window fyne.Window, previousPage func()){
getNewAttributeList := func()[]string{
if (newChoice == false){
newList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentEyeColorList, colorName)
newList, _ := helpers.DeleteAllMatchingItemsFromList(currentEyeColorList, colorName)
return newList
}
@ -2613,7 +2613,7 @@ func setBuildMateProfilePage_HairColor(window fyne.Window, previousPage func()){
getNewAttributeList := func()[]string{
if (newChoice == false){
newList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentHairColorList, colorName)
newList, _ := helpers.DeleteAllMatchingItemsFromList(currentHairColorList, colorName)
return newList
}


@ -482,7 +482,7 @@ func setAddContactFromIdentityHashPage(window fyne.Window, userIdentityHash [16]
newContactCategoriesList := helpers.AddItemToStringListAndAvoidDuplicate(currentContactCategoriesList, categoryName)
contactCategoriesListBinding.Set(newContactCategoriesList)
} else {
newContactCategoriesList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentContactCategoriesList, categoryName)
newContactCategoriesList, _ := helpers.DeleteAllMatchingItemsFromList(currentContactCategoriesList, categoryName)
contactCategoriesListBinding.Set(newContactCategoriesList)
}
}
@ -785,8 +785,8 @@ func setEditContactCategoriesPage(window fyne.Window, contactIdentityHash [16]by
newCategoriesList := append(currentContactCategoriesList, categoryName)
return newCategoriesList
}
newCategoriesList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentContactCategoriesList, categoryName)
newCategoriesList, _ := helpers.DeleteAllMatchingItemsFromList(currentContactCategoriesList, categoryName)
return newCategoriesList
}


@ -275,7 +275,7 @@ func setChooseDesiresPage_ProfileLanguage(window fyne.Window, previousPage func(
deleteLanguageButton := widget.NewButtonWithIcon("", theme.DeleteIcon(), func(){
newDesiredLanguagesList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentDesiredChoicesList, languageIdentifierBase64)
newDesiredLanguagesList, _ := helpers.DeleteAllMatchingItemsFromList(currentDesiredChoicesList, languageIdentifierBase64)
if (len(newDesiredLanguagesList) == 0){
@ -499,7 +499,7 @@ func setChooseDesiresPage_Country(window fyne.Window, previousPage func()){
deleteCountryButton := widget.NewButtonWithIcon("", theme.DeleteIcon(), func(){
newDesiredCountriesList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentDesiredChoicesList, countryIdentifierBase64)
newDesiredCountriesList, _ := helpers.DeleteAllMatchingItemsFromList(currentDesiredChoicesList, countryIdentifierBase64)
if (len(newDesiredCountriesList) == 0){
@ -805,7 +805,7 @@ func setChooseDesiresPage_SearchTerms(window fyne.Window, previousPage func()){
deleteTermButton := widget.NewButtonWithIcon("", theme.DeleteIcon(), func(){
newDesiredTermsList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentDesiredChoicesList, termNameBase64)
newDesiredTermsList, _ := helpers.DeleteAllMatchingItemsFromList(currentDesiredChoicesList, termNameBase64)
if (len(newDesiredTermsList) == 0){
@ -1287,7 +1287,7 @@ func getDesireEditor_Choice(window fyne.Window, currentPage func(), desireName s
return emptyList
}
newAttributeList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentDesiredChoicesList, "Other")
newAttributeList, _ := helpers.DeleteAllMatchingItemsFromList(currentDesiredChoicesList, "Other")
return newAttributeList
}


@ -109,7 +109,7 @@ func setChooseDesiresPage_Language(window fyne.Window, previousPage func()){
if (response == false){
newList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentDesiredChoicesList, "Other")
newList, _ := helpers.DeleteAllMatchingItemsFromList(currentDesiredChoicesList, "Other")
return newList
}
@ -195,7 +195,8 @@ func setChooseDesiresPage_Language(window fyne.Window, previousPage func()){
languageNameLabel := getBoldLabelCentered(translate(languageName))
deleteLanguageButton := widget.NewButtonWithIcon("", theme.DeleteIcon(), func(){
newDesiredLanguagesList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentDesiredChoicesList, languageNameBase64)
newDesiredLanguagesList, _ := helpers.DeleteAllMatchingItemsFromList(currentDesiredChoicesList, languageNameBase64)
if (len(newDesiredLanguagesList) == 0){


@ -1305,7 +1305,7 @@ func setChooseDesiresPage_23andMe_Haplogroup(window fyne.Window, maternalOrPater
return emptyList
}
newAttributeList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentDesiredChoicesList, "Other")
newAttributeList, _ := helpers.DeleteAllMatchingItemsFromList(currentDesiredChoicesList, "Other")
return newAttributeList
}
@ -1393,7 +1393,7 @@ func setChooseDesiresPage_23andMe_Haplogroup(window fyne.Window, maternalOrPater
deleteHaplogroupButton := widget.NewButtonWithIcon("", theme.DeleteIcon(), func(){
newDesiredHaplogroupsList, _ := helpers.DeleteAllMatchingItemsFromStringList(currentDesiredChoicesList, haplogroupNameBase64)
newDesiredHaplogroupsList, _ := helpers.DeleteAllMatchingItemsFromList(currentDesiredChoicesList, haplogroupNameBase64)
if (len(newDesiredHaplogroupsList) == 0){


@ -679,7 +679,7 @@ func setHomePage(window fyne.Window){
//TODO: Retrieve URL from parameters
// URL may need to be changed if it is lost or stolen
// Also add a page that shows .onion and .eth URLs
// Also add a page that shows .eth URL
seekiaLink := getBoldLabel("Seekia.net")
seekiaVersion := getLabelCentered("Seekia Version 0.60")


@ -569,7 +569,7 @@ func setPolygenicDiseaseLociExplainerPage(window fyne.Window, previousPage func(
description1 := getLabelCentered("Each polygenic disease has a set of associated genome loci.")
description2 := getLabelCentered("These are locations on the genome that can be tested to determine disease risk.")
description3 := getLabelCentered("The more loci that your genome contains, the more accurate your disease risk score will be.")
description3 := getLabelCentered("The more loci that your genome sequence contains, the more accurate your disease risk score will be.")
page := container.NewVBox(title, backButton, widget.NewSeparator(), subtitle, widget.NewSeparator(), description1, description2, description3)
@ -615,65 +615,6 @@ func setOffspringPolygenicDiseaseNumberOfLociTestedExplainerPage(window fyne.Win
}
func setPolygenicDiseaseLocusRiskWeightExplainerPage(window fyne.Window, previousPage func()){
title := getPageTitleCentered("Help - Locus Risk Weight")
backButton := getBackButtonCentered(previousPage)
subtitle := getPageSubtitleCentered("Locus Risk Weight")
description1 := getLabelCentered("A polygenic disease risk score is calculated by testing many locations on a genome.")
description2 := getLabelCentered("A genome will have a risk weight for each locus.")
description3 := getLabelCentered("A negative weight reduces the risk of the disease.")
description4 := getLabelCentered("A positive weight increases the risk of the disease.")
description5 := getLabelCentered("A 0 weight has no effect on the risk.")
page := container.NewVBox(title, backButton, widget.NewSeparator(), subtitle, widget.NewSeparator(), description1, description2, description3, description4, description5)
setPageContent(page, window)
}
func setOffspringPolygenicDiseaseLocusRiskWeightExplainerPage(window fyne.Window, previousPage func()){
title := getPageTitleCentered("Help - Locus Risk Weight")
backButton := getBackButtonCentered(previousPage)
subtitle := getPageSubtitleCentered("Offspring Locus Risk Weight")
description1 := getLabelCentered("A polygenic disease risk score is calculated by testing many locations on a genome.")
description2 := getLabelCentered("A genome will have a risk weight for each locus.")
description3 := getLabelCentered("A negative weight reduces the risk of the disease.")
description4 := getLabelCentered("A positive weight increases the risk of the disease.")
description5 := getLabelCentered("A 0 weight has no effect on the risk.")
description6 := getLabelCentered("An offspring's locus risk weight represents the average risk weight for all 4 possible locus outcomes.")
page := container.NewVBox(title, backButton, widget.NewSeparator(), subtitle, widget.NewSeparator(), description1, description2, description3, description4, description5, description6)
setPageContent(page, window)
}
func setPolygenicDiseaseLocusRiskWeightProbabilityExplainerPage(window fyne.Window, previousPage func()){
title := getPageTitleCentered("Help - Risk Weight Probability")
backButton := getBackButtonCentered(previousPage)
subtitle := getPageSubtitleCentered("Locus Risk Weight Probability")
description1 := getLabelCentered("A polygenic disease risk score is calculated by testing many locations on a genome.")
description2 := getLabelCentered("A genome will have a risk weight for each locus.")
description3 := getLabelCentered("A risk weight probability describes the probability of having that risk weight.")
description4 := getLabelCentered("For example, let's suppose a risk weight of 2 has a probability of 5%")
description5 := getLabelCentered("This means that 5% of people will have a risk weight of 2 at this locus.")
page := container.NewVBox(title, backButton, widget.NewSeparator(), subtitle, widget.NewSeparator(), description1, description2, description3, description4, description5)
setPageContent(page, window)
}
func setDiscreteTraitNeuralNetworkPredictionExplainerPage(window fyne.Window, previousPage func()){
title := getPageTitleCentered("Help - Neural Network Prediction")
@ -768,7 +709,7 @@ func setDiscreteTraitRulesExplainerPage(window fyne.Window, previousPage func())
backButton := getBackButtonCentered(previousPage)
subtitle := getPageSubtitleCentered("Trait Rules")
subtitle := getPageSubtitleCentered("Discrete Trait Rules")
description1 := getLabelCentered("Person genetic analyses contain discrete trait analyses.")
description2 := getLabelCentered("Discrete traits have multiple outcomes, and each outcome has an associated score.")


@ -1329,7 +1329,7 @@ func setCreateCouplePage(window fyne.Window, previousPage func()){
return
}
newList, deletedAny := helpers.DeleteAllMatchingItemsFromStringList(existingList, personIdentifier)
newList, deletedAny := helpers.DeleteAllMatchingItemsFromList(existingList, personIdentifier)
if (deletedAny == false){
setErrorEncounteredPage(window, errors.New("Person not found when trying to delete person from chosen people list."), currentPage)
return


@ -762,7 +762,7 @@ func setBrowseMatchesPage(window fyne.Window, previousPage func()){
currentAttributesList := strings.Split(currentAttributesListString, ",")
displayAttributesListPruned, _ := helpers.DeleteAllMatchingItemsFromStringList(currentAttributesList, currentSortByAttribute)
displayAttributesListPruned, _ := helpers.DeleteAllMatchingItemsFromList(currentAttributesList, currentSortByAttribute)
newDisplayAttributesList := []string{currentSortByAttribute}
@ -1024,23 +1024,44 @@ func setSelectMatchesSortByAttributePage(window fyne.Window, previousPage func()
err = addAttributeSelectButton("Physical", "OffspringTotalPolygenicDiseaseRiskScore", "Ascending")
if (err != nil) { return nil, err }
offspringLactoseToleranceProbabilityButton := widget.NewButton("Offspring Lactose Tolerance Probability", func(){
//TODO
showUnderConstructionDialog(window)
})
physicalAttributeButtonsGrid.Add(offspringLactoseToleranceProbabilityButton)
err = addAttributeSelectButton("Physical", "AutismRiskScore", "Ascending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringAutismRiskScore", "Ascending")
if (err != nil) { return nil, err }
offspringCurlyHairProbabilityButton := widget.NewButton("Offspring Curly Hair Probability", func(){
//TODO
showUnderConstructionDialog(window)
})
physicalAttributeButtonsGrid.Add(offspringCurlyHairProbabilityButton)
err = addAttributeSelectButton("Physical", "ObesityRiskScore", "Ascending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringObesityRiskScore", "Ascending")
if (err != nil) { return nil, err }
offspringStraightHairProbabilityButton := widget.NewButton("Offspring Straight Hair Probability", func(){
//TODO
showUnderConstructionDialog(window)
})
physicalAttributeButtonsGrid.Add(offspringStraightHairProbabilityButton)
err = addAttributeSelectButton("Physical", "OffspringBlueEyesProbability", "Descending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringGreenEyesProbability", "Descending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringHazelEyesProbability", "Descending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringBrownEyesProbability", "Descending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringLactoseToleranceProbability", "Descending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringStraightHairProbability", "Descending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringCurlyHairProbability", "Descending")
if (err != nil) { return nil, err }
// Numeric Traits:
err = addAttributeSelectButton("Physical", "HomosexualnessScore", "Ascending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringHomosexualnessScore", "Ascending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "PredictedHeight", "Descending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "OffspringPredictedHeight", "Descending")
if (err != nil) { return nil, err }
err = addAttributeSelectButton("Physical", "23andMe_NeanderthalVariants", "Descending")
if (err != nil) { return nil, err }
@ -1390,7 +1411,7 @@ func setCustomizeMatchDisplayPage(window fyne.Window, previousPage func()){
deleteAttributeButton := widget.NewButtonWithIcon("", theme.DeleteIcon(), func(){
newAttributesList, _ := helpers.DeleteAllMatchingItemsFromStringList(customDisplayAttributesList, attributeName)
newAttributesList, _ := helpers.DeleteAllMatchingItemsFromList(customDisplayAttributesList, attributeName)
if (len(newAttributesList) == 0){
@ -1555,7 +1576,6 @@ func setAddAttributeToCustomMatchDisplayPage(window fyne.Window, previousPage fu
"23andMe_MaternalHaplogroup",
"23andMe_PaternalHaplogroup",
"23andMe_NeanderthalVariants",
"OffspringProbabilityOfAnyMonogenicDisease",
"EyeColorSimilarity",
"EyeColorGenesSimilarity",
"HairColorSimilarity",
@ -1568,6 +1588,27 @@ func setAddAttributeToCustomMatchDisplayPage(window fyne.Window, previousPage fu
"23andMe_AncestralSimilarity",
"23andMe_MaternalHaplogroupSimilarity",
"23andMe_PaternalHaplogroupSimilarity",
"OffspringProbabilityOfAnyMonogenicDisease",
"TotalPolygenicDiseaseRiskScore",
"OffspringTotalPolygenicDiseaseRiskScore",
"AutismRiskScore",
"OffspringAutismRiskScore",
"ObesityRiskScore",
"OffspringObesityRiskScore",
"PredictedEyeColor",
"OffspringBlueEyesProbability",
"OffspringGreenEyesProbability",
"OffspringHazelEyesProbability",
"OffspringBrownEyesProbability",
"PredictedLactoseTolerance",
"OffspringLactoseToleranceProbability",
"PredictedHairTexture",
"OffspringStraightHairProbability",
"OffspringCurlyHairProbability",
"HomosexualnessScore",
"OffspringHomosexualnessScore",
"PredictedHeight",
"OffspringPredictedHeight",
}
lifestyleAttributeNamesList := []string{

File diff suppressed because it is too large

File diff suppressed because it is too large


@ -1,4 +1,3 @@
package gui
// viewGeneticReferencesGui.go implements pages to display information about genetic diseases and traits
@ -245,200 +244,6 @@ func setViewPolygenicDiseaseDetailsPage(window fyne.Window, diseaseName string,
}
func setViewPolygenicDiseaseLocusDetailsPage(window fyne.Window, diseaseName string, locusIdentifier string, previousPage func()){
currentPage := func(){setViewPolygenicDiseaseLocusDetailsPage(window, diseaseName, locusIdentifier, previousPage)}
title := getPageTitleCentered("Viewing Locus Details")
backButton := getBackButtonCentered(previousPage)
locusObject, err := polygenicDiseases.GetPolygenicDiseaseLocusObject(diseaseName, locusIdentifier)
if (err != nil){
setErrorEncounteredPage(window, err, previousPage)
return
}
locusRSID := locusObject.LocusRSID
locusReferencesMap := locusObject.References
locusRSIDsList := []int64{locusRSID}
// We add aliases to locusRSIDsList
anyAliasesExist, rsidAliasesList, err := locusMetadata.GetRSIDAliases(locusRSID)
if (err != nil){
setErrorEncounteredPage(window, err, previousPage)
return
}
if (anyAliasesExist == true){
locusRSIDsList = append(locusRSIDsList, rsidAliasesList...)
}
metadataExists, locusMetadataObject, err := locusMetadata.GetLocusMetadata(locusRSID)
if (err != nil){
setErrorEncounteredPage(window, err, previousPage)
return
}
if (metadataExists == false){
setErrorEncounteredPage(window, errors.New("setViewPolygenicDiseaseLocusDetailsPage called with locusRSID missing from locusMetadata."), previousPage)
return
}
locusGeneName := locusMetadataObject.GeneNamesList[0]
diseaseNameLabel := widget.NewLabel("Disease Name:")
diseaseNameText := getBoldLabel(diseaseName)
diseaseNameRow := container.NewHBox(layout.NewSpacer(), diseaseNameLabel, diseaseNameText, layout.NewSpacer())
getLocusNamesLabelText := func()string{
if(len(locusRSIDsList) == 1){
return "Locus Name:"
}
return "Locus Names:"
}
locusNamesLabelText := getLocusNamesLabelText()
locusRSIDStringsList := make([]string, 0, len(locusRSIDsList))
for _, locusRSID := range locusRSIDsList{
locusRSIDString := helpers.ConvertInt64ToString(locusRSID)
locusRSIDName := "rs" + locusRSIDString
locusRSIDStringsList = append(locusRSIDStringsList, locusRSIDName)
}
locusNamesListString := strings.Join(locusRSIDStringsList, ", ")
locusNamesLabel := widget.NewLabel(locusNamesLabelText)
locusNamesText := getBoldLabel(locusNamesListString)
locusNamesRow := container.NewHBox(layout.NewSpacer(), locusNamesLabel, locusNamesText, layout.NewSpacer())
geneNameLabel := widget.NewLabel("Gene Name:")
geneNameText := getBoldLabel(locusGeneName)
geneNameRow := container.NewHBox(layout.NewSpacer(), geneNameLabel, geneNameText, layout.NewSpacer())
viewReferencesButton := getWidgetCentered(widget.NewButtonWithIcon("View References", theme.ListIcon(), func(){
setViewGeneticAnalysisReferencesPage(window, "Locus", locusReferencesMap, currentPage)
}))
getBasePairsGrid := func()(*fyne.Container, error){
locusRiskWeightsMap := locusObject.RiskWeightsMap
locusBasePairProbabilitiesMap := locusObject.BasePairProbabilitiesMap
riskWeightLabel := getItalicLabelCentered("Risk Weight")
probabilityLabel := getItalicLabelCentered("Probability Of Weight")
riskWeightColumn := container.NewVBox(riskWeightLabel, widget.NewSeparator())
riskWeightProbabilityColumn := container.NewVBox(probabilityLabel, widget.NewSeparator())
// We create a new map with duplicates removed
locusBasePairProbabilitiesMap_DuplicatesRemoved := make(map[string]float64)
for basePair, basePairProbability := range locusBasePairProbabilitiesMap{
baseA, baseB, semicolonFound := strings.Cut(basePair, ";")
if (semicolonFound == false) {
return nil, errors.New("Invalid base pair found in locusBasePairProbabilitiesMap: " + basePair)
}
basePairDuplicate := baseB + ";" + baseA
existingProbabilityValue, exists := locusBasePairProbabilitiesMap_DuplicatesRemoved[basePairDuplicate]
if (exists == true){
// The duplicate has already been added.
// We make sure the probability values match
if (existingProbabilityValue != basePairProbability){
return nil, errors.New("locusBasePairProbabilitiesMap contains duplicate base pair with different value")
}
continue
}
locusBasePairProbabilitiesMap_DuplicatesRemoved[basePair] = basePairProbability
}
// All probabilities are mutually exclusive (you can only have 1 base pair for each genome locus)
// Thus, we can add them together to get a total probability for each risk weight
// Map structure: Risk Weight -> Probability of having weight
riskWeightProbabilitiesMap := make(map[int]float64)
for basePair, basePairProbability := range locusBasePairProbabilitiesMap_DuplicatesRemoved{
getBasePairRiskWeight := func()int{
basePairRiskWeight, exists := locusRiskWeightsMap[basePair]
if (exists == false){
// This base pair has no known weight. We treat it as a 0 weight.
return 0
}
return basePairRiskWeight
}
basePairRiskWeight := getBasePairRiskWeight()
riskWeightProbabilitiesMap[basePairRiskWeight] += basePairProbability
}
// Now we sort risk weights in order of least to greatest
allRiskWeightsList := helpers.GetListOfMapKeys(riskWeightProbabilitiesMap)
slices.Sort(allRiskWeightsList)
for _, riskWeight := range allRiskWeightsList{
riskWeightProbability, exists := riskWeightProbabilitiesMap[riskWeight]
if (exists == false){
return nil, errors.New("Risk weight probability not found in riskWeightProbabilitiesMap")
}
riskWeightString := helpers.ConvertIntToString(riskWeight)
riskWeightPercentageProbability := riskWeightProbability * 100
riskWeightProbabilityString := helpers.ConvertFloat64ToStringRounded(riskWeightPercentageProbability, 2)
riskWeightProbabilityFormatted := "~" + riskWeightProbabilityString + "%"
riskWeightText := getBoldLabelCentered(riskWeightString)
riskWeightProbabilityText := getBoldLabelCentered(riskWeightProbabilityFormatted)
riskWeightColumn.Add(riskWeightText)
riskWeightProbabilityColumn.Add(riskWeightProbabilityText)
riskWeightColumn.Add(widget.NewSeparator())
riskWeightProbabilityColumn.Add(widget.NewSeparator())
}
riskWeightHelpButton := widget.NewButtonWithIcon("", theme.QuestionIcon(), func(){
setPolygenicDiseaseLocusRiskWeightExplainerPage(window, currentPage)
})
riskWeightColumn.Add(riskWeightHelpButton)
probabilityHelpButton := widget.NewButtonWithIcon("", theme.QuestionIcon(), func(){
setPolygenicDiseaseLocusRiskWeightProbabilityExplainerPage(window, currentPage)
})
riskWeightProbabilityColumn.Add(probabilityHelpButton)
basePairsGrid := container.NewHBox(layout.NewSpacer(), riskWeightColumn, riskWeightProbabilityColumn, layout.NewSpacer())
return basePairsGrid, nil
}
basePairsGrid, err := getBasePairsGrid()
if (err != nil){
setErrorEncounteredPage(window, err, previousPage)
return
}
page := container.NewVBox(title, backButton, widget.NewSeparator(), diseaseNameRow, widget.NewSeparator(), locusNamesRow, widget.NewSeparator(), geneNameRow, widget.NewSeparator(), viewReferencesButton, widget.NewSeparator(), basePairsGrid)
setPageContent(page, window)
}
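The `getBasePairsGrid` logic above collapses unordered base-pair duplicates ("A;G" vs "G;A"), then sums probabilities per risk weight because base pairs at one locus are mutually exclusive. That core logic can be sketched in isolation (a simplified sketch; the real code works on `locusObject` fields):

```go
package main

import (
	"fmt"
	"strings"
)

// aggregateRiskWeights collapses unordered base-pair duplicates ("A;G" vs
// "G;A") and sums their probabilities per risk weight. Base pairs absent from
// riskWeights are treated as weight 0, mirroring the page logic above.
func aggregateRiskWeights(probabilities map[string]float64, riskWeights map[string]int) (map[int]float64, error) {
	deduped := make(map[string]float64)
	for pair, p := range probabilities {
		baseA, baseB, found := strings.Cut(pair, ";")
		if !found {
			return nil, fmt.Errorf("invalid base pair: %s", pair)
		}
		if existing, ok := deduped[baseB+";"+baseA]; ok {
			if existing != p {
				return nil, fmt.Errorf("duplicate base pair with different value: %s", pair)
			}
			continue
		}
		deduped[pair] = p
	}
	// Probabilities are mutually exclusive, so they sum per risk weight.
	result := make(map[int]float64)
	for pair, p := range deduped {
		result[riskWeights[pair]] += p // missing key yields 0 weight
	}
	return result, nil
}

func main() {
	probs := map[string]float64{"A;G": 0.3, "G;A": 0.3, "G;G": 0.4}
	weights := map[string]int{"A;G": 2, "G;A": 2}
	result, _ := aggregateRiskWeights(probs, weights)
	fmt.Println(result[2], result[0])
}
```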
func setViewGeneticAnalysisReferencesPage(window fyne.Window, referencesTopic string, referencesMap map[string]string, previousPage func()){
currentPage := func(){setViewGeneticAnalysisReferencesPage(window, referencesTopic, referencesMap, previousPage)}
@ -627,5 +432,3 @@ func setViewDiscreteTraitRuleDetailsPage(window fyne.Window, traitName string, r

File diff suppressed because it is too large


@ -26,19 +26,19 @@ func ApplyCartoonEffect(inputImage image.Image, effectStrength int)(image.Image,
}
blurKernelSize, err := helpers.ScaleNumberProportionally(true, effectStrength, 0, 100, 1, 3)
blurKernelSize, err := helpers.ScaleIntProportionally(true, effectStrength, 0, 100, 1, 3)
if (err != nil) { return nil, err }
if (blurKernelSize % 2 == 0){
blurKernelSize += 1
}
edgeThreshold, err := helpers.ScaleNumberProportionally(false, effectStrength, 0, 100, 5, 200)
edgeThreshold, err := helpers.ScaleIntProportionally(false, effectStrength, 0, 100, 5, 200)
if (err != nil) { return nil, err }
oilFilterSize, err := helpers.ScaleNumberProportionally(true, effectStrength, 0, 100, 5, 20)
oilFilterSize, err := helpers.ScaleIntProportionally(true, effectStrength, 0, 100, 5, 20)
if (err != nil) { return nil, err }
oilLevels, err := helpers.ScaleNumberProportionally(true, effectStrength, 0, 100, 1, 3)
oilLevels, err := helpers.ScaleIntProportionally(true, effectStrength, 0, 100, 1, 3)
if (err != nil) { return nil, err }
options := CTOpts{
@ -78,7 +78,7 @@ func ApplyPencilEffect(inputImage image.Image, effectStrength int)(image.Image,
goeffectsImageObject, err := convertGolangImageObjectToGoeffectsImageObject(inputImage)
if (err != nil) { return nil, err }
blurAmount, err := helpers.ScaleNumberProportionally(true, effectStrength, 0, 100, 1, 20)
blurAmount, err := helpers.ScaleIntProportionally(true, effectStrength, 0, 100, 1, 20)
if (err != nil) { return nil, err }
if (blurAmount % 2 == 0) {
@ -119,7 +119,7 @@ func ApplyWireframeEffect(inputImage image.Image, effectStrength int, lightMode
grayscaleGoeffectsImage, err := grayscaleEffectObject.Apply(&goeffectsImageObject, 5)
if (err != nil) { return nil, err }
threshold, err := helpers.ScaleNumberProportionally(false, effectStrength, 0, 100, 10, 100)
threshold, err := helpers.ScaleIntProportionally(false, effectStrength, 0, 100, 10, 100)
if (err != nil) { return nil, err }
sobelEffectObject := NewSobel(threshold, lightMode)
@ -146,10 +146,10 @@ func ApplyOilPaintingEffect(inputImage image.Image, effectStrength int)(image.Im
return inputImage, nil
}
filterSize, err := helpers.ScaleNumberProportionally(true, effectStrength, 0, 100, 10, 30)
filterSize, err := helpers.ScaleIntProportionally(true, effectStrength, 0, 100, 10, 30)
if (err != nil) { return nil, err }
levels, err := helpers.ScaleNumberProportionally(true, effectStrength, 0, 100, 10, 70)
levels, err := helpers.ScaleIntProportionally(true, effectStrength, 0, 100, 10, 70)
if (err != nil) { return nil, err }
oilPaintingEffectObject := NewOilPainting(filterSize, levels)
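This file's changes rename `helpers.ScaleNumberProportionally` to `helpers.ScaleIntProportionally`. Judging from the call sites, the helper maps a value from one integer range onto another, with the boolean selecting direct versus inverted scaling. A sketch under those assumed semantics (the real helper's behavior may differ in rounding and validation):

```go
package main

import (
	"errors"
	"fmt"
)

// scaleIntProportionally maps value from [oldMin, oldMax] onto
// [newMin, newMax]. When direct is false the mapping is inverted, so a
// larger input yields a smaller output. Semantics assumed from call sites.
func scaleIntProportionally(direct bool, value int, oldMin int, oldMax int, newMin int, newMax int) (int, error) {
	if value < oldMin || value > oldMax || oldMin >= oldMax {
		return 0, errors.New("invalid scaling input")
	}
	scaled := newMin + ((value-oldMin)*(newMax-newMin))/(oldMax-oldMin)
	if !direct {
		scaled = newMax - (scaled - newMin)
	}
	return scaled, nil
}

func main() {
	// effectStrength 50 on a [1, 3] blur kernel range, as in ApplyCartoonEffect
	size, _ := scaleIntProportionally(true, 50, 0, 100, 1, 3)
	// inverted: weakest effect strength gives the highest edge threshold
	threshold, _ := scaleIntProportionally(false, 0, 0, 100, 5, 200)
	fmt.Println(size, threshold)
}
```

Note the calling code still has to post-process some results, e.g. forcing `blurKernelSize` to be odd.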

View file

@ -12,6 +12,7 @@ import "seekia/resources/geneticReferences/locusMetadata"
import "seekia/resources/geneticReferences/monogenicDiseases"
import "seekia/resources/geneticReferences/polygenicDiseases"
import "seekia/resources/geneticReferences/traits"
import "seekia/resources/trainedPredictionModels"
import "seekia/resources/worldLanguages"
import "seekia/resources/worldLocations"
@ -395,9 +396,14 @@ func initializeApplicationVariables()error{
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
err = polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) { return err }
traits.InitializeTraitVariables()
err = traits.InitializeTraitVariables()
if (err != nil) { return err }
err = trainedPredictionModels.InitializeTrainedPredictionModels()
if (err != nil) { return err }
err = profileFormat.InitializeProfileFormatVariables()
if (err != nil) { return err }


@ -945,9 +945,7 @@ func GetFakeProfile(profileType string, identityPublicKey [32]byte, identityPriv
diseaseLociList := diseaseObject.LociList
for _, locusObject := range diseaseLociList{
locusRSID := locusObject.LocusRSID
for _, locusRSID := range diseaseLociList{
shareableRSIDsMap[locusRSID] = struct{}{}
}
@ -963,7 +961,7 @@ func GetFakeProfile(profileType string, identityPublicKey [32]byte, identityPriv
rsidString := helpers.ConvertInt64ToString(rsID)
attributeName := "LocusValue_rs" + rsidString
locusValueAttributeName := "LocusValue_rs" + rsidString
baseA, err := helpers.GetRandomItemFromList(locusBasesList)
if (err != nil) { return nil, err }
@ -971,9 +969,17 @@ func GetFakeProfile(profileType string, identityPublicKey [32]byte, identityPriv
baseB, err := helpers.GetRandomItemFromList(locusBasesList)
if (err != nil) { return nil, err }
attributeValue := baseA + ";" + baseB
locusValueAttributeValue := baseA + ";" + baseB
profileMap[attributeName] = attributeValue
profileMap[locusValueAttributeName] = locusValueAttributeValue
locusIsPhasedAttributeName := "LocusIsPhased_rs" + rsidString
locusIsPhased := helpers.GetRandomBool()
locusIsPhasedString := helpers.ConvertBoolToYesOrNoString(locusIsPhased)
profileMap[locusIsPhasedAttributeName] = locusIsPhasedString
}
}
}
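The `GetFakeProfile` hunk above now emits two attributes per locus: a random unordered base pair under `LocusValue_rs…` and a random Yes/No flag under `LocusIsPhased_rs…`. A self-contained sketch of that step, with the diff's helpers (`GetRandomItemFromList`, `GetRandomBool`, `ConvertBoolToYesOrNoString`) inlined:

```go
package main

import (
	"fmt"
	"math/rand/v2"
)

// makeFakeLocusAttributes generates the two fake-profile attributes for one
// locus: a random base pair such as "A;G" and a random LocusIsPhased flag.
func makeFakeLocusAttributes(rsidString string) map[string]string {
	bases := []string{"C", "A", "T", "G"}
	baseA := bases[rand.IntN(len(bases))]
	baseB := bases[rand.IntN(len(bases))]
	attributes := map[string]string{}
	attributes["LocusValue_rs"+rsidString] = baseA + ";" + baseB
	phasedString := "No"
	if rand.IntN(2) == 0 {
		phasedString = "Yes"
	}
	attributes["LocusIsPhased_rs"+rsidString] = phasedString
	return attributes
}

func main() {
	attributes := makeFakeLocusAttributes("53576") // hypothetical RSID
	fmt.Println(len(attributes))
}
```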


@ -37,10 +37,18 @@ func TestGenerateParameters(t *testing.T){
func TestGenerateProfiles(t *testing.T){
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
traits.InitializeTraitVariables()
err := profileFormat.InitializeProfileFormatVariables()
err := polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = profileFormat.InitializeProfileFormatVariables()
if (err != nil) {
t.Fatalf("Failed to initialize profile format variables: " + err.Error())
}


@ -12,7 +12,7 @@ package createCoupleGeneticAnalysis
// TODO: We want to eventually use neural nets for polygenic disease analysis (see geneticPrediction.go)
// This is only possible once we get access to the necessary training data
import "seekia/resources/geneticPredictionModels"
import "seekia/resources/trainedPredictionModels"
import "seekia/resources/geneticReferences/locusMetadata"
import "seekia/resources/geneticReferences/monogenicDiseases"
import "seekia/resources/geneticReferences/polygenicDiseases"
@ -27,7 +27,6 @@ import "seekia/internal/helpers"
import "errors"
import mathRand "math/rand/v2"
import "slices"
import "maps"
import "reflect"
@ -40,8 +39,8 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
person1PrepareRawGenomesUpdatePercentageCompleteFunction := func(newPercentage int)error{
newPercentageCompletion, err := helpers.ScaleNumberProportionally(true, newPercentage, 0, 100, 0, 25)
if (err != nil){ return err }
newPercentageCompletion, err := helpers.ScaleIntProportionally(true, newPercentage, 0, 100, 0, 25)
if (err != nil) { return err }
err = updatePercentageCompleteFunction(newPercentageCompletion)
if (err != nil) { return err }
@ -49,8 +48,18 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
return nil
}
person1GenomesWithMetadataList, allPerson1RawGenomeIdentifiersList, person1HasMultipleGenomes, person1OnlyExcludeConflictsGenomeIdentifier, person1OnlyIncludeSharedGenomeIdentifier, err := prepareRawGenomes.GetGenomesWithMetadataListFromRawGenomesList(person1GenomesList, person1PrepareRawGenomesUpdatePercentageCompleteFunction)
anyUsefulLocationsExist, person1GenomesWithMetadataList, allPerson1RawGenomeIdentifiersList, person1HasMultipleGenomes, person1OnlyExcludeConflictsGenomeIdentifier, person1OnlyIncludeSharedGenomeIdentifier, err := prepareRawGenomes.GetGenomesWithMetadataListFromRawGenomesList(person1GenomesList, person1PrepareRawGenomesUpdatePercentageCompleteFunction)
if (err != nil) { return false, "", err }
if (anyUsefulLocationsExist == false){
// We should have checked for this when genomes were first imported.
return false, "", errors.New("CreateCoupleGeneticAnalysis called with person1GenomesList that does not contain any useful genomes")
}
if (len(person1GenomesList) > 1 && (len(person1GenomesList) != (len(person1GenomesWithMetadataList)-2)) ){
// If there is more than 1 genome, 2 combined genomes are created
// We are checking to make sure that none of the input genomes were dropped due to not having any locations
// We should have checked to make sure each input genome has useful locations when each genome was first imported.
return false, "", errors.New("CreateCoupleGeneticAnalysis called with person1GenomesList containing at least 1 genome without useful locations.")
}
processIsStopped := checkIfProcessIsStopped()
if (processIsStopped == true){
@ -59,7 +68,7 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
person2PrepareRawGenomesUpdatePercentageCompleteFunction := func(newPercentage int)error{
newPercentageCompletion, err := helpers.ScaleNumberProportionally(true, newPercentage, 0, 100, 25, 50)
newPercentageCompletion, err := helpers.ScaleIntProportionally(true, newPercentage, 0, 100, 25, 50)
if (err != nil){ return err }
err = updatePercentageCompleteFunction(newPercentageCompletion)
@ -68,8 +77,18 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
return nil
}
person2GenomesWithMetadataList, allPerson2RawGenomeIdentifiersList, person2HasMultipleGenomes, person2OnlyExcludeConflictsGenomeIdentifier, person2OnlyIncludeSharedGenomeIdentifier, err := prepareRawGenomes.GetGenomesWithMetadataListFromRawGenomesList(person2GenomesList, person2PrepareRawGenomesUpdatePercentageCompleteFunction)
anyUsefulLocationsExist, person2GenomesWithMetadataList, allPerson2RawGenomeIdentifiersList, person2HasMultipleGenomes, person2OnlyExcludeConflictsGenomeIdentifier, person2OnlyIncludeSharedGenomeIdentifier, err := prepareRawGenomes.GetGenomesWithMetadataListFromRawGenomesList(person2GenomesList, person2PrepareRawGenomesUpdatePercentageCompleteFunction)
if (err != nil) { return false, "", err }
if (anyUsefulLocationsExist == false){
// We should have checked for this when genomes were first imported.
return false, "", errors.New("CreateCoupleGeneticAnalysis called with person2GenomesList that does not contain any useful genomes")
}
if (len(person2GenomesList) > 1 && (len(person2GenomesList) != (len(person2GenomesWithMetadataList)-2)) ){
// If there is more than 1 genome, 2 combined genomes are created
// We are checking to make sure that none of the input genomes were dropped due to not having any locations
// We should have checked to make sure each input genome has useful locations when each genome was first imported.
return false, "", errors.New("CreateCoupleGeneticAnalysis called with person2GenomesList containing at least 1 genome without useful locations.")
}
processIsStopped = checkIfProcessIsStopped()
if (processIsStopped == true){
@ -492,7 +511,6 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
for _, diseaseObject := range polygenicDiseaseObjectsList{
diseaseName := diseaseObject.DiseaseName
diseaseLociList := diseaseObject.LociList
// This map stores the polygenic disease info for each genome pair
// Map Structure: Genome Pair Identifier -> OffspringGenomePairPolygenicDiseaseInfo
@ -512,9 +530,13 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
return errors.New("addGenomePairDiseaseInfoToDiseaseMap called with unknown person2GenomeIdentifier.")
}
anyOffspringLocusTested, genomePairOffspringAverageRiskScore, quantityOfLociTested, genomePairOffspringDiseaseLociInfoMap, genomePairSampleOffspringRiskScoresList, err := GetOffspringPolygenicDiseaseInfo(diseaseLociList, person1LocusValuesMap, person2LocusValuesMap)
neuralNetworkExists, anyOffspringLocusKnown, offspringAverageRiskScore, accuracyRangesMap, predictedRiskScoresList, quantityOfLociKnown, quantityOfParentalPhasedLoci, err := GetOffspringPolygenicDiseaseAnalysis(diseaseObject, person1LocusValuesMap, person2LocusValuesMap)
if (err != nil) { return err }
if (anyOffspringLocusTested == false){
if (neuralNetworkExists == false){
// We cannot analyze this disease
return nil
}
if (anyOffspringLocusKnown == false){
// We have no information about this genome pair's disease risk
// We don't add this genome pair's disease info to the diseaseInfoMap
return nil
@ -522,10 +544,11 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
newOffspringGenomePairPolygenicDiseaseInfo := geneticAnalysis.OffspringGenomePairPolygenicDiseaseInfo{
QuantityOfLociTested: quantityOfLociTested,
OffspringAverageRiskScore: genomePairOffspringAverageRiskScore,
LociInfoMap: genomePairOffspringDiseaseLociInfoMap,
SampleOffspringRiskScoresList: genomePairSampleOffspringRiskScoresList,
OffspringAverageRiskScore: offspringAverageRiskScore,
PredictionConfidenceRangesMap: accuracyRangesMap,
QuantityOfLociKnown: quantityOfLociKnown,
QuantityOfParentalPhasedLoci: quantityOfParentalPhasedLoci,
SampleOffspringRiskScoresList: predictedRiskScoresList,
}
genomePairIdentifier := helpers.JoinTwo16ByteArrays(person1GenomeIdentifier, person2GenomeIdentifier)
@ -544,6 +567,11 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
if (err != nil) { return false, "", err }
}
if (len(offspringPolygenicDiseaseInfoMap) == 0){
// No disease analysis was performed
continue
}
newOffspringPolygenicDiseaseInfoObject := geneticAnalysis.OffspringPolygenicDiseaseInfo{
PolygenicDiseaseInfoMap: offspringPolygenicDiseaseInfoMap,
}
@ -624,7 +652,7 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
newOffspringGenomePairTraitInfo := geneticAnalysis.OffspringGenomePairDiscreteTraitInfo{}
neuralNetworkExists, neuralNetworkAnalysisExists, outcomeProbabilitiesMap, averagePredictionConfidence, quantityOfLociTested, quantityOfParentalPhasedLoci, err := GetOffspringDiscreteTraitInfo_NeuralNetwork(traitObject, person1LocusValuesMap, person2LocusValuesMap)
neuralNetworkExists, neuralNetworkAnalysisExists, outcomeProbabilitiesMap, averagePredictionConfidence, quantityOfLociTested, quantityOfParentalPhasedLoci, err := GetOffspringDiscreteTraitAnalysis_NeuralNetwork(traitObject, person1LocusValuesMap, person2LocusValuesMap)
if (err != nil) { return err }
if (neuralNetworkExists == true){
@ -645,7 +673,7 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
}
}
anyRulesExist, rulesAnalysisExists, quantityOfRulesTested, quantityOfLociKnown, offspringProbabilityOfPassingRulesMap, offspringOutcomeProbabilitiesMap, err := GetOffspringDiscreteTraitInfo_Rules(traitObject, person1LocusValuesMap, person2LocusValuesMap)
anyRulesExist, rulesAnalysisExists, quantityOfRulesTested, quantityOfLociKnown, offspringProbabilityOfPassingRulesMap, offspringOutcomeProbabilitiesMap, err := GetOffspringDiscreteTraitAnalysis_Rules(traitObject, person1LocusValuesMap, person2LocusValuesMap)
if (err != nil) { return err }
if (anyRulesExist == true){
@ -722,6 +750,104 @@ func CreateCoupleGeneticAnalysis(person1GenomesList []prepareRawGenomes.RawGenom
}
offspringDiscreteTraitsMap[traitName] = newOffspringTraitInfoObject
} else {
// traitIsDiscreteOrNumeric == "Numeric"
// This map stores the trait info for each genome pair
// Map Structure: Genome Pair Identifier -> OffspringGenomePairNumericTraitInfo
offspringTraitInfoMap := make(map[[32]byte]geneticAnalysis.OffspringGenomePairNumericTraitInfo)
// This will add the offspring trait information for the provided genome pair to the offspringTraitInfoMap
addGenomePairTraitInfoToOffspringMap := func(person1GenomeIdentifier [16]byte, person2GenomeIdentifier [16]byte)error{
person1LocusValuesMap, exists := person1GenomesMap[person1GenomeIdentifier]
if (exists == false){
return errors.New("addGenomePairTraitInfoToOffspringMap called with unknown person1GenomeIdentifier.")
}
person2LocusValuesMap, exists := person2GenomesMap[person2GenomeIdentifier]
if (exists == false){
return errors.New("addGenomePairTraitInfoToOffspringMap called with unknown person2GenomeIdentifier.")
}
neuralNetworkExists, neuralNetworkAnalysisExists, averageOutcome, predictionConfidenceRangesMap, sampleOffspringOutcomesList, quantityOfLociTested, quantityOfParentalPhasedLoci, err := GetOffspringNumericTraitAnalysis(traitObject, person1LocusValuesMap, person2LocusValuesMap)
if (err != nil) { return err }
if (neuralNetworkExists == false){
// Predictions are not possible for this trait
return nil
}
if (neuralNetworkAnalysisExists == false){
// No locations for this trait exist in which both users' genomes contain information
return nil
}
newOffspringGenomePairTraitInfo := geneticAnalysis.OffspringGenomePairNumericTraitInfo{
OffspringAverageOutcome: averageOutcome,
PredictionConfidenceRangesMap: predictionConfidenceRangesMap,
QuantityOfParentalPhasedLoci: quantityOfParentalPhasedLoci,
QuantityOfLociKnown: quantityOfLociTested,
SampleOffspringOutcomesList: sampleOffspringOutcomesList,
}
genomePairIdentifier := helpers.JoinTwo16ByteArrays(person1GenomeIdentifier, person2GenomeIdentifier)
offspringTraitInfoMap[genomePairIdentifier] = newOffspringGenomePairTraitInfo
return nil
}
err = addGenomePairTraitInfoToOffspringMap(pair1Person1GenomeIdentifier, pair1Person2GenomeIdentifier)
if (err != nil) { return false, "", err }
if (genomePair2Exists == true){
err := addGenomePairTraitInfoToOffspringMap(pair2Person1GenomeIdentifier, pair2Person2GenomeIdentifier)
if (err != nil) { return false, "", err }
}
newOffspringTraitInfoObject := geneticAnalysis.OffspringNumericTraitInfo{
TraitInfoMap: offspringTraitInfoMap,
}
if (len(offspringTraitInfoMap) >= 2){
// We check for conflicts
// Conflicts are only possible if two genome pairs exist with information about the trait
checkIfConflictExists := func()(bool, error){
// We check for conflicts between each genome pair's numeric trait info objects
genomePairTraitInfoObject := geneticAnalysis.OffspringGenomePairNumericTraitInfo{}
firstItemReached := false
for _, currentGenomePairTraitInfoObject := range offspringTraitInfoMap{
if (firstItemReached == false){
genomePairTraitInfoObject = currentGenomePairTraitInfoObject
firstItemReached = true
continue
}
areEqual := reflect.DeepEqual(genomePairTraitInfoObject, currentGenomePairTraitInfoObject)
if (areEqual == false){
return true, nil
}
}
return false, nil
}
conflictExists, err := checkIfConflictExists()
if (err != nil) { return false, "", err }
newOffspringTraitInfoObject.ConflictExists = conflictExists
}
offspringNumericTraitsMap[traitName] = newOffspringTraitInfoObject
}
}
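The conflict check added above compares each genome pair's trait info object against the first one seen, using `reflect.DeepEqual`; any mismatch means the pairs disagree. A simplified sketch of that pattern (the real map keys are `[32]byte` identifiers and the values are analysis structs; strings and slices stand in here):

```go
package main

import (
	"fmt"
	"reflect"
)

// checkIfConflictExists reports whether any two values in the map differ.
// It remembers the first value seen and deep-compares every later value to it.
func checkIfConflictExists(infoMap map[string][]float64) bool {
	var firstInfo []float64
	firstItemReached := false
	for _, currentInfo := range infoMap {
		if !firstItemReached {
			firstInfo = currentInfo
			firstItemReached = true
			continue
		}
		if !reflect.DeepEqual(firstInfo, currentInfo) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(checkIfConflictExists(map[string][]float64{"pair1": {1, 2}, "pair2": {1, 2}}))
	fmt.Println(checkIfConflictExists(map[string][]float64{"pair1": {1, 2}, "pair2": {1, 3}}))
}
```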
@ -833,288 +959,109 @@ func GetOffspringMonogenicDiseaseProbabilities(dominantOrRecessive string, perso
return true, percentageProbabilityOffspringHasDiseaseInt, true, percentageProbabilityOffspringHasVariantInt, nil
}
// This is used to calculate offspring polygenic disease info for users
// It is faster to do it this way because we don't create 100 prospective offspring
// We instead create 4 outcomes for each locus
// We can do this because testing each locus's risk score is independent of every other locus
// This is not true for traits, because trait rules are affected by multiple different loci
// When using the fast method for polygenic diseases, we don't get a sample of 100 offspring disease risk scores.
// The average risk score should still be the same for the fast and normal methods
// This function is also faster because we don't calculate odds ratio information or information about each locus
//Outputs:
// -bool: Any loci tested (if false, no offspring polygenic disease information is known)
// -int: Offspring Risk Score (Value between 0-10)
// -int: Number of loci tested
// -error
func GetOffspringPolygenicDiseaseInfo_Fast(diseaseLociList []polygenicDiseases.DiseaseLocus, person1LocusValuesMap map[int64]locusValue.LocusValue, person2LocusValuesMap map[int64]locusValue.LocusValue)(bool, int, int, error){
if (len(person1LocusValuesMap) == 0){
return false, 0, 0, nil
}
if (len(person2LocusValuesMap) == 0){
return false, 0, 0, nil
}
// I = Insertion
// D = Deletion
validAllelesList := []string{"C", "A", "T", "G", "I", "D"}
numberOfLociTested := 0
offspringSummedRiskWeights := 0
offspringMinimumPossibleRiskWeightSum := 0
offspringMaximumPossibleRiskWeightSum := 0
for _, locusObject := range diseaseLociList{
locusRSID := locusObject.LocusRSID
locusRiskWeightsMap := locusObject.RiskWeightsMap
locusMinimumWeight := locusObject.MinimumRiskWeight
locusMaximumWeight := locusObject.MaximumRiskWeight
person1LocusValueFound, person1LocusBase1Value, person1LocusBase2Value, _, _, err := createPersonGeneticAnalysis.GetLocusValueFromGenomeMap(true, person1LocusValuesMap, locusRSID)
if (err != nil) { return false, 0, 0, err }
if (person1LocusValueFound == false){
// None of the offspring will have a value for this locus
continue
}
person2LocusValueFound, person2LocusBase1Value, person2LocusBase2Value, _, _, err := createPersonGeneticAnalysis.GetLocusValueFromGenomeMap(true, person2LocusValuesMap, locusRSID)
if (err != nil) { return false, 0, 0, err }
if (person2LocusValueFound == false){
// None of the offspring will have a value for this locus
continue
}
baseIsValid := slices.Contains(validAllelesList, person1LocusBase1Value)
if (baseIsValid == false){
return false, 0, 0, errors.New("GetOffspringPolygenicDiseaseInfo_Fast called with genomeMap containing invalid locus value base: " + person1LocusBase1Value)
}
baseIsValid = slices.Contains(validAllelesList, person1LocusBase2Value)
if (baseIsValid == false){
return false, 0, 0, errors.New("GetOffspringPolygenicDiseaseInfo_Fast called with genomeMap containing invalid locus value base: " + person1LocusBase2Value)
}
baseIsValid = slices.Contains(validAllelesList, person2LocusBase1Value)
if (baseIsValid == false){
return false, 0, 0, errors.New("GetOffspringPolygenicDiseaseInfo_Fast called with genomeMap containing invalid locus value base: " + person2LocusBase1Value)
}
baseIsValid = slices.Contains(validAllelesList, person2LocusBase2Value)
if (baseIsValid == false){
return false, 0, 0, errors.New("GetOffspringPolygenicDiseaseInfo_Fast called with genomeMap containing invalid locus value base: " + person2LocusBase2Value)
}
numberOfLociTested += 1
offspringBasePairOutcome1 := person1LocusBase1Value + ";" + person2LocusBase1Value
offspringBasePairOutcome2 := person1LocusBase2Value + ";" + person2LocusBase2Value
offspringBasePairOutcome3 := person1LocusBase1Value + ";" + person2LocusBase2Value
offspringBasePairOutcome4 := person1LocusBase2Value + ";" + person2LocusBase1Value
baseOutcomesList := []string{offspringBasePairOutcome1, offspringBasePairOutcome2, offspringBasePairOutcome3, offspringBasePairOutcome4}
outcomesSummedRiskWeight := 0
for _, outcomeBasePair := range baseOutcomesList{
offspringOutcomeRiskWeight, exists := locusRiskWeightsMap[outcomeBasePair]
if (exists == false){
// We do not know the risk weight for this base pair
// We treat this as a 0 risk weight
continue
}
outcomesSummedRiskWeight += offspringOutcomeRiskWeight
}
locusAverageRiskWeight := outcomesSummedRiskWeight/4
offspringSummedRiskWeights += locusAverageRiskWeight
offspringMinimumPossibleRiskWeightSum += locusMinimumWeight
offspringMaximumPossibleRiskWeightSum += locusMaximumWeight
}
if (numberOfLociTested == 0){
// No loci were tested
return false, 0, 0, nil
}
offspringAverageDiseaseRiskScore, err := helpers.ScaleNumberProportionally(true, offspringSummedRiskWeights, offspringMinimumPossibleRiskWeightSum, offspringMaximumPossibleRiskWeightSum, 0, 10)
if (err != nil) { return false, 0, 0, err }
return true, offspringAverageDiseaseRiskScore, numberOfLociTested, nil
}
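The per-locus step of the fast method described above can be sketched in isolation. This is a hypothetical, self-contained helper (the names `averageLocusRiskWeight` and its parameters are illustrative, not from the source): the four equally likely offspring allele pairings are scored against the locus's risk-weight map, and the weights are averaged with integer division, treating pairings missing from the map as risk weight 0, as the function above does.

```go
package main

import "fmt"

// averageLocusRiskWeight sketches the fast method's per-locus step.
// The four equally likely offspring allele pairings are looked up in
// riskWeightsMap (keys are "base1;base2" strings) and their risk
// weights are averaged using integer division.
// Pairings missing from the map contribute a risk weight of 0.
func averageLocusRiskWeight(p1Base1 string, p1Base2 string, p2Base1 string, p2Base2 string, riskWeightsMap map[string]int) int {
	outcomesList := []string{
		p1Base1 + ";" + p2Base1,
		p1Base2 + ";" + p2Base2,
		p1Base1 + ";" + p2Base2,
		p1Base2 + ";" + p2Base1,
	}
	outcomesSummedRiskWeight := 0
	for _, outcomeBasePair := range outcomesList {
		// Missing pairings return the zero value, which is 0
		outcomesSummedRiskWeight += riskWeightsMap[outcomeBasePair]
	}
	return outcomesSummedRiskWeight / 4
}

func main() {
	// Both parents are heterozygous A/G at this locus
	riskWeightsMap := map[string]int{"A;A": 4, "A;G": 2, "G;A": 2, "G;G": 0}
	fmt.Println(averageLocusRiskWeight("A", "G", "A", "G", riskWeightsMap))
	// prints 2: the four outcomes (4 + 0 + 2 + 2) sum to 8, and 8/4 = 2
}
```

Because every pairing is equally likely, this per-locus average equals the expected risk weight over offspring, which is why the fast method matches the 100-offspring method's average.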
//Outputs:
// -bool: Any loci tested (if false, no offspring polygenic disease information is known)
// -int: Offspring Risk Score (Value between 0-10)
// -int: Number of loci tested
// -map[[3]byte]geneticAnalysis.OffspringPolygenicDiseaseLocusInfo: Offspring Locus information map
// Map Structure: Locus identifier -> OffspringPolygenicDiseaseLocusInfo
// -bool: A neural network exists for this disease
// -bool: Any loci tested (if false, no offspring polygenic disease analysis is known)
// -int: Offspring Average Risk Score (Value between 0-10)
// -map[int]float64: Prediction accuracy ranges map
// -Map Structure: Probability prediction is accurate (X) -> Distance that must be travelled in both directions from the prediction to
// create a range in which the true value will fall, X% of the time
// -[]int: Sample offspring risks scores list
// -int: Quantity of loci known
// -int: Quantity of parental phased loci
// -error
func GetOffspringPolygenicDiseaseInfo(diseaseLociList []polygenicDiseases.DiseaseLocus, person1LocusValuesMap map[int64]locusValue.LocusValue, person2LocusValuesMap map[int64]locusValue.LocusValue)(bool, int, int, map[[3]byte]geneticAnalysis.OffspringPolygenicDiseaseLocusInfo, []int, error){
func GetOffspringPolygenicDiseaseAnalysis(diseaseObject polygenicDiseases.PolygenicDisease, person1LocusValuesMap map[int64]locusValue.LocusValue, person2LocusValuesMap map[int64]locusValue.LocusValue)(bool, bool, int, map[int]float64, []int, int, int, error){
diseaseName := diseaseObject.DiseaseName
modelExists := trainedPredictionModels.CheckIfAttributeNeuralNetworkExists(diseaseName)
if (modelExists == false){
// Prediction is not possible for this disease
return false, false, 0, nil, nil, 0, 0, nil
}
if (len(person1LocusValuesMap) == 0){
return false, 0, 0, nil, nil, nil
return true, false, 0, nil, nil, 0, 0, nil
}
if (len(person2LocusValuesMap) == 0){
return false, 0, 0, nil, nil, nil
return true, false, 0, nil, nil, 0, 0, nil
}
// First, we create 100 prospective offspring genomes.
diseaseLociList := diseaseObject.LociList
diseaseLociRSIDsList := make([]int64, 0)
// First we count up the quantity of parental phased loci
// We only count the quantity of phased loci for loci which are known for both parents
for _, diseaseLocusObject := range diseaseLociList{
quantityOfParentalPhasedLoci := 0
locusRSID := diseaseLocusObject.LocusRSID
diseaseLociRSIDsList = append(diseaseLociRSIDsList, locusRSID)
for _, rsID := range diseaseLociList{
person1LocusValue, exists := person1LocusValuesMap[rsID]
if (exists == false){
continue
}
person2LocusValue, exists := person2LocusValuesMap[rsID]
if (exists == false){
continue
}
person1LocusIsPhased := person1LocusValue.LocusIsPhased
if (person1LocusIsPhased == true){
quantityOfParentalPhasedLoci += 1
}
person2LocusIsPhased := person2LocusValue.LocusIsPhased
if (person2LocusIsPhased == true){
quantityOfParentalPhasedLoci += 1
}
}
anyLocusValueExists, prospectiveOffspringGenomesList, err := getProspectiveOffspringGenomesList(diseaseLociRSIDsList, person1LocusValuesMap, person2LocusValuesMap)
if (err != nil) { return false, 0, 0, nil, nil, err }
// We create 100 prospective offspring genomes.
anyLocusValueExists, prospectiveOffspringGenomesList, err := getProspectiveOffspringGenomesList(diseaseLociList, person1LocusValuesMap, person2LocusValuesMap)
if (err != nil) { return false, false, 0, nil, nil, 0, 0, err }
if (anyLocusValueExists == false){
return false, 0, 0, nil, nil, nil
return true, false, 0, nil, nil, 0, 0, nil
}
// This will sum every offspring's average disease risk score
offspringAverageRiskScoreSum := 0
// A list of predicted risk scores for each offspring
predictedRiskScoresList := make([]int, 0)
// This stores a list of every prospective offspring's risk score
sampleOffspringRiskScoresList := make([]int, 0)
accuracyRangesMap := make(map[int]float64)
quantityOfLociTested := 0
type offspringSummedLocusInfoObject struct{
for index, offspringGenomeMap := range prospectiveOffspringGenomesList{
SummedLocusRiskWeights int
SummedOddsRatios float64
NumberOfSummedOddsRatios int
// This is the number of unknown-odds-ratio-weight sums that we summed up
NumberOfUnknownOddsRatioWeightSums int
// This is the sum of every unknown-odds-ratio-weight-sum for each prospective offspring for this genome
UnknownOddsRatioWeightSumsSummed int
}
// Map Structure: Locus Identifier -> offspringSummedLocusInfoObject
offspringLocusInfoSumsMap := make(map[[3]byte]offspringSummedLocusInfoObject)
for offspringIndex, offspringGenomeMap := range prospectiveOffspringGenomesList{
offspringSummedRiskWeights := 0
offspringMinimumPossibleRiskWeightSum := 0
offspringMaximumPossibleRiskWeightSum := 0
for _, locusObject := range diseaseLociList{
locusIdentifierHex := locusObject.LocusIdentifier
locusIdentifier, err := encoding.DecodeHexStringTo3ByteArray(locusIdentifierHex)
if (err != nil) { return false, 0, 0, nil, nil, err }
offspringLocusInfoSumsObject, exists := offspringLocusInfoSumsMap[locusIdentifier]
if (exists == false){
if (offspringIndex != 0){
// We already checked a previous offspring for this locus, and its value doesn't exist
continue
}
}
locusRSID := locusObject.LocusRSID
locusRiskWeightsMap := locusObject.RiskWeightsMap
locusOddsRatiosMap := locusObject.OddsRatiosMap
locusMinimumWeight := locusObject.MinimumRiskWeight
locusMaximumWeight := locusObject.MaximumRiskWeight
basePairValueFound, locusBase1Value, locusBase2Value, _, _, err := createPersonGeneticAnalysis.GetLocusValueFromGenomeMap(true, offspringGenomeMap, locusRSID)
if (err != nil) { return false, 0, 0, nil, nil, err }
if (basePairValueFound == false){
// None of the offspring will have a value for this locus
continue
}
locusRiskWeight, locusOddsRatioIsKnown, locusOddsRatio, err := createPersonGeneticAnalysis.GetGenomePolygenicDiseaseLocusRiskInfo(locusRiskWeightsMap, locusOddsRatiosMap, locusBase1Value, locusBase2Value)
if (err != nil) { return false, 0, 0, nil, nil, err }
offspringLocusInfoSumsObject.SummedLocusRiskWeights += locusRiskWeight
if (locusOddsRatioIsKnown == true){
offspringLocusInfoSumsObject.SummedOddsRatios += locusOddsRatio
offspringLocusInfoSumsObject.NumberOfSummedOddsRatios += 1
} else {
offspringLocusInfoSumsObject.UnknownOddsRatioWeightSumsSummed += locusRiskWeight
offspringLocusInfoSumsObject.NumberOfUnknownOddsRatioWeightSums += 1
}
offspringLocusInfoSumsMap[locusIdentifier] = offspringLocusInfoSumsObject
offspringSummedRiskWeights += locusRiskWeight
offspringMinimumPossibleRiskWeightSum += locusMinimumWeight
offspringMaximumPossibleRiskWeightSum += locusMaximumWeight
neuralNetworkExists, predictionIsKnown, predictedRiskScore, predictionAccuracyRangesMap, currentQuantityOfLociTested, _, err := createPersonGeneticAnalysis.GetPersonGenomePolygenicDiseaseAnalysis(diseaseObject, offspringGenomeMap, false)
if (err != nil){ return false, false, 0, nil, nil, 0, 0, err }
if (neuralNetworkExists == false){
return false, false, 0, nil, nil, 0, 0, errors.New("GetPersonGenomePolygenicDiseaseAnalysis claiming that neural network doesn't exist when we already checked.")
}
if (predictionIsKnown == false){
return false, false, 0, nil, nil, 0, 0, errors.New("GetPersonGenomePolygenicDiseaseAnalysis claiming that prediction is impossible when we already know at least 1 locus value exists for disease.")
}
offspringAverageDiseaseRiskScore, err := helpers.ScaleNumberProportionally(true, offspringSummedRiskWeights, offspringMinimumPossibleRiskWeightSum, offspringMaximumPossibleRiskWeightSum, 0, 10)
if (err != nil) { return false, 0, 0, nil, nil, err }
predictedRiskScoresList = append(predictedRiskScoresList, predictedRiskScore)
sampleOffspringRiskScoresList = append(sampleOffspringRiskScoresList, offspringAverageDiseaseRiskScore)
offspringAverageRiskScoreSum += offspringAverageDiseaseRiskScore
if (index == 0){
// These values should be the same for each predicted offspring
accuracyRangesMap = predictionAccuracyRangesMap
quantityOfLociTested = currentQuantityOfLociTested
}
}
numberOfLociTested := len(offspringLocusInfoSumsMap)
// We calculate the average predicted risk score
if (numberOfLociTested == 0){
// No locations were tested
return false, 0, 0, nil, nil, nil
outcomesSum := 0
for _, predictedRiskScore := range predictedRiskScoresList{
outcomesSum += predictedRiskScore
}
offspringAverageRiskScore := offspringAverageRiskScoreSum/100
averageRiskScore := outcomesSum/100
// Map Structure: Locus Identifier -> OffspringPolygenicDiseaseLocusInfo
offspringDiseaseLociInfoMap := make(map[[3]byte]geneticAnalysis.OffspringPolygenicDiseaseLocusInfo)
for locusIdentifier, summedLocusInfoObject := range offspringLocusInfoSumsMap{
summedLocusRiskWeights := summedLocusInfoObject.SummedLocusRiskWeights
summedOddsRatios := summedLocusInfoObject.SummedOddsRatios
numberOfSummedOddsRatios := summedLocusInfoObject.NumberOfSummedOddsRatios
numberOfUnknownOddsRatioWeightSums := summedLocusInfoObject.NumberOfUnknownOddsRatioWeightSums
unknownOddsRatioWeightSumsSummed := summedLocusInfoObject.UnknownOddsRatioWeightSumsSummed
// There are 100 prospective offspring, so we divide by 100
locusAverageRiskWeight := summedLocusRiskWeights/100
newLocusInfoObject := geneticAnalysis.OffspringPolygenicDiseaseLocusInfo{
OffspringAverageRiskWeight: locusAverageRiskWeight,
}
if (numberOfSummedOddsRatios != 0){
newLocusInfoObject.OffspringOddsRatioIsKnown = true
offspringAverageOddsRatio := summedOddsRatios/float64(numberOfSummedOddsRatios)
newLocusInfoObject.OffspringAverageOddsRatio = offspringAverageOddsRatio
}
if (numberOfUnknownOddsRatioWeightSums != 0){
offspringAverageUnknownOddsRatiosWeightSum := unknownOddsRatioWeightSumsSummed/numberOfUnknownOddsRatioWeightSums
newLocusInfoObject.OffspringAverageUnknownOddsRatiosWeightSum = offspringAverageUnknownOddsRatiosWeightSum
}
offspringDiseaseLociInfoMap[locusIdentifier] = newLocusInfoObject
}
return true, offspringAverageRiskScore, numberOfLociTested, offspringDiseaseLociInfoMap, sampleOffspringRiskScoresList, nil
return true, true, averageRiskScore, accuracyRangesMap, predictedRiskScoresList, quantityOfLociTested, quantityOfParentalPhasedLoci, nil
}
@@ -1127,16 +1074,16 @@ func GetOffspringPolygenicDiseaseInfo(diseaseLociList []polygenicDiseases.Diseas
// -int: Quantity of loci tested
// -int: Quantity of parental phased loci
// -error
func GetOffspringDiscreteTraitInfo_NeuralNetwork(traitObject traits.Trait, person1LocusValuesMap map[int64]locusValue.LocusValue, person2LocusValuesMap map[int64]locusValue.LocusValue)(bool, bool, map[string]int, int, int, int, error){
func GetOffspringDiscreteTraitAnalysis_NeuralNetwork(traitObject traits.Trait, person1LocusValuesMap map[int64]locusValue.LocusValue, person2LocusValuesMap map[int64]locusValue.LocusValue)(bool, bool, map[string]int, int, int, int, error){
traitName := traitObject.TraitName
traitIsDiscreteOrNumeric := traitObject.DiscreteOrNumeric
if (traitIsDiscreteOrNumeric != "Discrete"){
return false, false, nil, 0, 0, 0, errors.New("GetOffspringDiscreteTraitInfo_NeuralNetwork called with non-discrete trait.")
return false, false, nil, 0, 0, 0, errors.New("GetOffspringDiscreteTraitAnalysis_NeuralNetwork called with non-discrete trait.")
}
modelExists, _ := geneticPredictionModels.GetGeneticPredictionModelBytes(traitName)
modelExists := trainedPredictionModels.CheckIfAttributeNeuralNetworkExists(traitName)
if (modelExists == false){
// Neural network prediction is not possible for this trait
return false, false, nil, 0, 0, 0, nil
@@ -1225,7 +1172,7 @@ func GetOffspringDiscreteTraitInfo_NeuralNetwork(traitObject traits.Trait, perso
// -map[string]int: Offspring outcome probabilities map
// Map Structure: Outcome Name -> Offspring probability of outcome (0-100)
// -error
func GetOffspringDiscreteTraitInfo_Rules(traitObject traits.Trait, person1LocusValuesMap map[int64]locusValue.LocusValue, person2LocusValuesMap map[int64]locusValue.LocusValue)(bool, bool, int, int, map[[3]byte]int, map[string]int, error){
func GetOffspringDiscreteTraitAnalysis_Rules(traitObject traits.Trait, person1LocusValuesMap map[int64]locusValue.LocusValue, person2LocusValuesMap map[int64]locusValue.LocusValue)(bool, bool, int, int, map[[3]byte]int, map[string]int, error){
traitRulesList := traitObject.RulesList
@@ -1303,6 +1250,109 @@ func GetOffspringDiscreteTraitInfo_Rules(traitObject traits.Trait, person1LocusV
}
//Outputs:
// -bool: A neural network exists for this trait
// -bool: Analysis exists (at least 1 locus exists for this analysis from both people's genomes)
// -float64: Average outcome for offspring
// -map[int]float64: Prediction accuracy ranges map
// -Map Structure: Probability prediction is accurate (X) -> Distance that must be travelled in both directions from the prediction to
// create a range in which the true value will fall, X% of the time
// -[]float64: A list of 100 offspring outcomes
// -int: Quantity of loci known
// -int: Quantity of parental phased loci
// -error
func GetOffspringNumericTraitAnalysis(traitObject traits.Trait, person1LocusValuesMap map[int64]locusValue.LocusValue, person2LocusValuesMap map[int64]locusValue.LocusValue)(bool, bool, float64, map[int]float64, []float64, int, int, error){
traitName := traitObject.TraitName
traitIsDiscreteOrNumeric := traitObject.DiscreteOrNumeric
if (traitIsDiscreteOrNumeric != "Numeric"){
return false, false, 0, nil, nil, 0, 0, errors.New("GetOffspringNumericTraitAnalysis called with non-numeric trait.")
}
modelExists := trainedPredictionModels.CheckIfAttributeNeuralNetworkExists(traitName)
if (modelExists == false){
// Prediction is not possible for this trait
return false, false, 0, nil, nil, 0, 0, nil
}
traitLociList := traitObject.LociList
// First we count up the quantity of parental phased loci
// We only count the quantity of phased loci for loci which are known for both parents
quantityOfParentalPhasedLoci := 0
for _, rsID := range traitLociList{
person1LocusValue, exists := person1LocusValuesMap[rsID]
if (exists == false){
continue
}
person2LocusValue, exists := person2LocusValuesMap[rsID]
if (exists == false){
continue
}
person1LocusIsPhased := person1LocusValue.LocusIsPhased
if (person1LocusIsPhased == true){
quantityOfParentalPhasedLoci += 1
}
person2LocusIsPhased := person2LocusValue.LocusIsPhased
if (person2LocusIsPhased == true){
quantityOfParentalPhasedLoci += 1
}
}
// Next, we create 100 prospective offspring genomes.
anyLocusValueExists, prospectiveOffspringGenomesList, err := getProspectiveOffspringGenomesList(traitLociList, person1LocusValuesMap, person2LocusValuesMap)
if (err != nil) { return false, false, 0, nil, nil, 0, 0, err }
if (anyLocusValueExists == false){
return true, false, 0, nil, nil, 0, 0, nil
}
// A list of predicted outcomes for each offspring
predictedOutcomesList := make([]float64, 0)
accuracyRangesMap := make(map[int]float64)
quantityOfLociTested := 0
for index, offspringGenomeMap := range prospectiveOffspringGenomesList{
neuralNetworkExists, predictionIsKnown, predictedOutcome, predictionAccuracyRangesMap, currentQuantityOfLociTested, _, err := createPersonGeneticAnalysis.GetGenomeNumericTraitAnalysis(traitObject, offspringGenomeMap, false)
if (err != nil){ return false, false, 0, nil, nil, 0, 0, err }
if (neuralNetworkExists == false){
return false, false, 0, nil, nil, 0, 0, errors.New("GetGenomeNumericTraitAnalysis claiming that neural network doesn't exist when we already checked.")
}
if (predictionIsKnown == false){
return false, false, 0, nil, nil, 0, 0, errors.New("GetGenomeNumericTraitAnalysis claiming that prediction is impossible when we already know at least 1 locus value exists for trait.")
}
predictedOutcomesList = append(predictedOutcomesList, predictedOutcome)
if (index == 0){
// These values should be the same for each predicted offspring
accuracyRangesMap = predictionAccuracyRangesMap
quantityOfLociTested = currentQuantityOfLociTested
}
}
// We calculate the average predicted outcome
outcomesSum := float64(0)
for _, predictedOutcome := range predictedOutcomesList{
outcomesSum += predictedOutcome
}
averageOutcome := outcomesSum/100
return true, true, averageOutcome, accuracyRangesMap, predictedOutcomesList, quantityOfLociTested, quantityOfParentalPhasedLoci, nil
}
// This function will return a list of 100 prospective offspring genomes
// Each genome represents an equal-probability offspring genome from both people's genomes
// This function takes into account the effects of genetic linkage


@@ -8,6 +8,7 @@ import "seekia/resources/geneticReferences/locusMetadata"
import "seekia/resources/geneticReferences/monogenicDiseases"
import "seekia/resources/geneticReferences/polygenicDiseases"
import "seekia/resources/geneticReferences/traits"
import "seekia/resources/trainedPredictionModels"
import "seekia/internal/genetics/createRawGenomes"
import "seekia/internal/genetics/prepareRawGenomes"
@@ -25,8 +26,21 @@ func TestCreateCoupleGeneticAnalysis_SingleGenomes(t *testing.T){
}
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
traits.InitializeTraitVariables()
err = polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = trainedPredictionModels.InitializeTrainedPredictionModels()
if (err != nil) {
t.Fatalf("InitializeTrainedPredictionModels failed: " + err.Error())
}
getPersonGenomesList := func()([]prepareRawGenomes.RawGenomeWithMetadata, error){
@@ -100,8 +114,21 @@ func TestCreateCoupleGeneticAnalysis_SingleAndMultipleGenomes(t *testing.T){
}
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
traits.InitializeTraitVariables()
err = polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = trainedPredictionModels.InitializeTrainedPredictionModels()
if (err != nil) {
t.Fatalf("InitializeTrainedPredictionModels failed: " + err.Error())
}
getPersonGenomesList := func(addSecondGenome bool)([]prepareRawGenomes.RawGenomeWithMetadata, error){
@@ -189,7 +216,6 @@ func TestCreateCoupleGeneticAnalysis_SingleAndMultipleGenomes(t *testing.T){
}
func TestCreateCoupleGeneticAnalysis_MultipleGenomes(t *testing.T){
err := locusMetadata.InitializeLocusMetadataVariables()
@@ -198,8 +224,21 @@ func TestCreateCoupleGeneticAnalysis_MultipleGenomes(t *testing.T){
}
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
traits.InitializeTraitVariables()
err = polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = trainedPredictionModels.InitializeTrainedPredictionModels()
if (err != nil) {
t.Fatalf("InitializeTrainedPredictionModels failed: " + err.Error())
}
getPersonGenomesList := func()([]prepareRawGenomes.RawGenomeWithMetadata, error){


@@ -8,10 +8,6 @@ package createPersonGeneticAnalysis
// Disclaimer: I am a novice in the ways of genetics. This package could be flawed in numerous ways.
// TODO: We want to eventually use neural nets for both trait and polygenic disease analysis (see geneticPrediction.go)
// These will be trained on a set of genomes and will output a probability analysis for each trait/disease
// This is only possible once we get access to the necessary training data
// TODO: Add the ability to weight different genome files based on their reliability.
// Some files are much more accurate because they record each location many times.
@@ -19,10 +15,10 @@ import "seekia/resources/geneticReferences/locusMetadata"
import "seekia/resources/geneticReferences/monogenicDiseases"
import "seekia/resources/geneticReferences/polygenicDiseases"
import "seekia/resources/geneticReferences/traits"
import "seekia/resources/trainedPredictionModels"
import "seekia/internal/encoding"
import "seekia/internal/genetics/geneticAnalysis"
import "seekia/internal/genetics/geneticPrediction"
import "seekia/internal/genetics/locusValue"
import "seekia/internal/genetics/prepareRawGenomes"
import "seekia/internal/helpers"
@@ -40,7 +36,7 @@ func CreatePersonGeneticAnalysis(genomesList []prepareRawGenomes.RawGenomeWithMe
prepareRawGenomesUpdatePercentageCompleteFunction := func(newPercentage int)error{
newPercentageCompletion, err := helpers.ScaleNumberProportionally(true, newPercentage, 0, 100, 0, 50)
newPercentageCompletion, err := helpers.ScaleIntProportionally(true, newPercentage, 0, 100, 0, 50)
if (err != nil){ return err }
err = updatePercentageCompleteFunction(newPercentageCompletion)
@@ -49,8 +45,18 @@ func CreatePersonGeneticAnalysis(genomesList []prepareRawGenomes.RawGenomeWithMe
return nil
}
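The proportional scaling used by this progress callback (and by the risk-score computations elsewhere in the diff) can be sketched as follows. This is a hypothetical stand-in for `helpers.ScaleIntProportionally`, written under the assumption that it performs a linear range mapping; the real helper's leading boolean parameter and error handling are omitted here.

```go
package main

import "fmt"

// scaleIntProportionally linearly maps value from the range
// [oldMin, oldMax] onto [newMin, newMax] using integer arithmetic.
// It is a simplified stand-in for helpers.ScaleIntProportionally.
func scaleIntProportionally(value int, oldMin int, oldMax int, newMin int, newMax int) int {
	return newMin + ((value-oldMin)*(newMax-newMin))/(oldMax-oldMin)
}

func main() {
	// A genome-preparation step that is 80% complete maps to 40% of the
	// overall analysis, matching the 0-100 -> 0-50 scaling above.
	fmt.Println(scaleIntProportionally(80, 0, 100, 0, 50)) // prints 40
}
```

The same mapping, with risk-weight sums as the old range and 0-10 as the new range, produces the 0-10 disease risk scores returned by the analysis functions.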
genomesWithMetadataList, allRawGenomeIdentifiersList, multipleGenomesExist, onlyExcludeConflictsGenomeIdentifier, onlyIncludeSharedGenomeIdentifier, err := prepareRawGenomes.GetGenomesWithMetadataListFromRawGenomesList(genomesList, prepareRawGenomesUpdatePercentageCompleteFunction)
anyUsefulLocationsExist, genomesWithMetadataList, allRawGenomeIdentifiersList, multipleGenomesExist, onlyExcludeConflictsGenomeIdentifier, onlyIncludeSharedGenomeIdentifier, err := prepareRawGenomes.GetGenomesWithMetadataListFromRawGenomesList(genomesList, prepareRawGenomesUpdatePercentageCompleteFunction)
if (err != nil) { return false, "", err }
if (anyUsefulLocationsExist == false){
// We should have checked for this when genomes were first imported.
return false, "", errors.New("CreatePersonGeneticAnalysis called with genomeList containing no genomes with useful locations.")
}
if (len(genomesList) > 1 && (len(genomesList) != (len(genomesWithMetadataList)-2)) ){
// If there is more than 1 genome, 2 combined genomes are created
// We are checking to make sure that none of the input genomes were dropped due to not having any locations
// We should have checked to make sure each input genome has useful locations when each genome was first imported.
return false, "", errors.New("CreatePersonGeneticAnalysis called with genomeList containing at least 1 genome without useful locations.")
}
// This map stores each genome's locus values
// Map Structure: Genome Identifier -> Genome locus values map (rsID -> Locus Value)
@@ -140,6 +146,14 @@ func CreatePersonGeneticAnalysis(genomesList []prepareRawGenomes.RawGenomeWithMe
if (err != nil) { return false, "", err }
analysisDiscreteTraitsMap[traitName] = personTraitAnalysisObject
} else {
//traitIsDiscreteOrNumeric == "Numeric"
personTraitAnalysisObject, err := GetPersonNumericTraitAnalysis(genomesWithMetadataList, traitObject)
if (err != nil) { return false, "", err }
analysisNumericTraitsMap[traitName] = personTraitAnalysisObject
}
}
@@ -649,80 +663,6 @@ func GetPersonMonogenicDiseaseAnalysis(inputGenomesWithMetadataList []prepareRaw
return personMonogenicDiseaseInfoObject, nil
}
//Outputs:
// -bool: Any loci tested
// -int: Person genome risk score (value between 0-10)
// -int: Person Genome Number of loci tested
// -map[[3]byte]geneticAnalysis.PersonGenomePolygenicDiseaseLocusInfo: Person disease locus info map
// Map Structure: Locus Identifier -> PersonGenomePolygenicDiseaseLocusInfo
// -error
func GetPersonGenomePolygenicDiseaseInfo(diseaseLociList []polygenicDiseases.DiseaseLocus, personLocusValuesMap map[int64]locusValue.LocusValue, lookForLocusAliases bool)(bool, int, int, map[[3]byte]geneticAnalysis.PersonGenomePolygenicDiseaseLocusInfo, error){
if (len(personLocusValuesMap) == 0){
return false, 0, 0, nil, nil
}
// Map Structure: Locus Identifier -> PersonGenomePolygenicDiseaseLocusInfo
genomeLociInfoMap := make(map[[3]byte]geneticAnalysis.PersonGenomePolygenicDiseaseLocusInfo)
summedDiseaseRiskWeight := 0
minimumPossibleRiskWeightSum := 0
maximumPossibleRiskWeightSum := 0
for _, locusObject := range diseaseLociList{
locusRSID := locusObject.LocusRSID
locusRiskWeightsMap := locusObject.RiskWeightsMap
locusOddsRatiosMap := locusObject.OddsRatiosMap
locusMinimumWeight := locusObject.MinimumRiskWeight
locusMaximumWeight := locusObject.MaximumRiskWeight
locusValueFound, locusBase1Value, locusBase2Value, _, _, err := GetLocusValueFromGenomeMap(lookForLocusAliases, personLocusValuesMap, locusRSID)
if (err != nil) { return false, 0, 0, nil, err }
if (locusValueFound == false){
continue
}
locusRiskWeight, locusOddsRatioIsKnown, locusOddsRatio, err := GetGenomePolygenicDiseaseLocusRiskInfo(locusRiskWeightsMap, locusOddsRatiosMap, locusBase1Value, locusBase2Value)
if (err != nil) { return false, 0, 0, nil, err }
newLocusInfoObject := geneticAnalysis.PersonGenomePolygenicDiseaseLocusInfo{
RiskWeight: locusRiskWeight,
OddsRatioIsKnown: locusOddsRatioIsKnown,
}
if (locusOddsRatioIsKnown == true){
newLocusInfoObject.OddsRatio = locusOddsRatio
}
locusIdentifierHex := locusObject.LocusIdentifier
locusIdentifier, err := encoding.DecodeHexStringTo3ByteArray(locusIdentifierHex)
if (err != nil) { return false, 0, 0, nil, err }
genomeLociInfoMap[locusIdentifier] = newLocusInfoObject
minimumPossibleRiskWeightSum += locusMinimumWeight
maximumPossibleRiskWeightSum += locusMaximumWeight
summedDiseaseRiskWeight += locusRiskWeight
}
numberOfLociTested := len(genomeLociInfoMap)
if (numberOfLociTested == 0){
// We have no information about this disease for this genome
return false, 0, 0, nil, nil
}
diseaseRiskScore, err := helpers.ScaleNumberProportionally(true, summedDiseaseRiskWeight, minimumPossibleRiskWeightSum, maximumPossibleRiskWeightSum, 0, 10)
if (err != nil) { return false, 0, 0, nil, err }
return true, diseaseRiskScore, numberOfLociTested, genomeLociInfoMap, nil
}
//Outputs:
// -geneticAnalysis.PersonPolygenicDiseaseInfo
// -error
@@ -731,8 +671,6 @@ func GetPersonPolygenicDiseaseAnalysis(inputGenomesWithMetadataList []prepareRaw
// We use this when returning errors
emptyDiseaseInfoObject := geneticAnalysis.PersonPolygenicDiseaseInfo{}
diseaseLociList := diseaseObject.LociList
// This map stores the polygenic disease for each of the person's genomes
// Map Structure: Genome Identifier -> PersonGenomePolygenicDiseaseInfo
personPolygenicDiseaseInfoMap := make(map[[16]byte]geneticAnalysis.PersonGenomePolygenicDiseaseInfo)
@@ -744,33 +682,17 @@ func GetPersonPolygenicDiseaseAnalysis(inputGenomesWithMetadataList []prepareRaw
genomeIdentifier := genomeWithMetadataObject.GenomeIdentifier
genomeMap := genomeWithMetadataObject.GenomeMap
// This map stores the loci for this disease and does not contain loci which do not belong to this disease
// Map Structure: rsID -> Locus Value
genomeLocusValuesMap := make(map[int64]locusValue.LocusValue)
for _, locusObject := range diseaseLociList{
locusRSID := locusObject.LocusRSID
locusValueFound, _, _, _, locusValueObject, err := GetLocusValueFromGenomeMap(true, genomeMap, locusRSID)
if (err != nil) { return emptyDiseaseInfoObject, err }
if (locusValueFound == false){
continue
}
genomeLocusValuesMap[locusRSID] = locusValueObject
}
anyLociTested, personDiseaseRiskScore, genomeNumberOfLociTested, genomeLociInfoMap, err := GetPersonGenomePolygenicDiseaseInfo(diseaseLociList, genomeLocusValuesMap, true)
neuralNetworkExists, anyLociTested, personDiseaseRiskScore, predictionAccuracyRangesMap, genomeQuantityOfLociKnown, genomeQuantityOfPhasedLoci, err := GetPersonGenomePolygenicDiseaseAnalysis(diseaseObject, genomeMap, true)
if (err != nil) { return emptyDiseaseInfoObject, err }
if (anyLociTested == false){
if (neuralNetworkExists == false || anyLociTested == false){
continue
}
newDiseaseInfoObject := geneticAnalysis.PersonGenomePolygenicDiseaseInfo{
QuantityOfLociTested: genomeNumberOfLociTested,
RiskScore: personDiseaseRiskScore,
LociInfoMap: genomeLociInfoMap,
ConfidenceRangesMap: predictionAccuracyRangesMap,
QuantityOfLociKnown: genomeQuantityOfLociKnown,
QuantityOfPhasedLoci: genomeQuantityOfPhasedLoci,
}
personPolygenicDiseaseInfoMap[genomeIdentifier] = newDiseaseInfoObject
@@ -792,75 +714,20 @@ func GetPersonPolygenicDiseaseAnalysis(inputGenomesWithMetadataList []prepareRaw
// First we check to see if any of the genomes have different risk scores or NumberOfLociTested
genomeRiskScore := 0
genomeNumberOfLociTested := 0
personGenomePolygenicDiseaseInfoObject := geneticAnalysis.PersonGenomePolygenicDiseaseInfo{}
firstItemReached := false
for _, personGenomeDiseaseInfoObject := range personPolygenicDiseaseInfoMap{
currentGenomeRiskScore := personGenomeDiseaseInfoObject.RiskScore
currentGenomeNumberOfLociTested := personGenomeDiseaseInfoObject.QuantityOfLociTested
if (firstItemReached == false){
genomeRiskScore = currentGenomeRiskScore
genomeNumberOfLociTested = currentGenomeNumberOfLociTested
personGenomePolygenicDiseaseInfoObject = personGenomeDiseaseInfoObject
firstItemReached = true
continue
}
if (genomeRiskScore != currentGenomeRiskScore){
return true, nil
}
if (genomeNumberOfLociTested != currentGenomeNumberOfLociTested){
return true, nil
}
}
// Now we check for conflicts between the different locus values
// We consider a conflict any time the same locus has different weights/odds ratios
// We don't care if the loci have different base pair values, so long as those base pairs have the same risk weights/odds ratios
for _, locusObject := range diseaseLociList{
locusIdentifierHex := locusObject.LocusIdentifier
locusIdentifier, err := encoding.DecodeHexStringTo3ByteArray(locusIdentifierHex)
if (err != nil) { return false, err }
locusRiskWeight := 0
locusOddsRatio := float64(0)
firstItemReached := false
for _, personGenomeDiseaseInfoObject := range personPolygenicDiseaseInfoMap{
genomeLociInfoMap := personGenomeDiseaseInfoObject.LociInfoMap
genomeLocusObject, exists := genomeLociInfoMap[locusIdentifier]
if (exists == false){
if (firstItemReached == true){
// A previous genome has information for this locus, and the current one does not
return true, nil
}
continue
}
genomeLocusRiskWeight := genomeLocusObject.RiskWeight
genomeLocusOddsRatio := genomeLocusObject.OddsRatio
if (firstItemReached == false){
locusRiskWeight = genomeLocusRiskWeight
locusOddsRatio = genomeLocusOddsRatio
firstItemReached = true
continue
}
if (locusRiskWeight == genomeLocusRiskWeight && locusOddsRatio == genomeLocusOddsRatio){
// No conflict exists for this locus on the genomes we have already checked
continue
}
// Conflict exists
areEqual := reflect.DeepEqual(personGenomeDiseaseInfoObject, personGenomePolygenicDiseaseInfoObject)
if (areEqual == false){
return true, nil
}
}
@ -877,6 +744,67 @@ func GetPersonPolygenicDiseaseAnalysis(inputGenomesWithMetadataList []prepareRaw
}
//Outputs:
// -bool: Neural network exists for disease
// -bool: Any loci tested
// -int: Person genome risk score (value between 0-10)
// -map[int]float64: Confidence ranges map
// -If we want to know how accurate the prediction is with a X% accuracy, how far would we have to expand the
// risk score's range to be accurate, X% of the time?
// -Map Structure: Percentage -> Distance to travel in both directions of prediction
// -int: Person Genome quantity of loci known
// -int: Person genome quantity of phased loci
// -error
func GetPersonGenomePolygenicDiseaseAnalysis(diseaseObject polygenicDiseases.PolygenicDisease, personGenomeMap map[int64]locusValue.LocusValue, checkForAliases bool)(bool, bool, int, map[int]float64, int, int, error){
diseaseLociList := diseaseObject.LociList
getGenomeLocusValuesMap := func()(map[int64]locusValue.LocusValue, error){
if (checkForAliases == false){
// We don't need to check for rsID aliases.
return personGenomeMap, nil
}
// This map contains the locus values for the genome
// If a locus's entry doesn't exist, its value is unknown
// Map Structure: Locus rsID -> Locus Value
genomeLocusValuesMap := make(map[int64]locusValue.LocusValue)
for _, locusRSID := range diseaseLociList{
locusBasePairKnown, _, _, _, locusValueObject, err := GetLocusValueFromGenomeMap(checkForAliases, personGenomeMap, locusRSID)
if (err != nil) { return nil, err }
if (locusBasePairKnown == false){
continue
}
genomeLocusValuesMap[locusRSID] = locusValueObject
}
return genomeLocusValuesMap, nil
}
genomeLocusValuesMap, err := getGenomeLocusValuesMap()
if (err != nil) { return false, false, 0, nil, 0, 0, err }
diseaseName := diseaseObject.DiseaseName
neuralNetworkModelExists, riskScorePredictionIsPossible, predictedRiskScore, predictionAccuracyRangesMap, quantityOfLociKnown, quantityOfPhasedLoci, err := trainedPredictionModels.GetNeuralNetworkNumericAttributePredictionFromGenomeMap(diseaseName, diseaseLociList, genomeLocusValuesMap)
if (err != nil) { return false, false, 0, nil, 0, 0, err }
if (neuralNetworkModelExists == false){
return false, false, 0, nil, 0, 0, nil
}
if (riskScorePredictionIsPossible == false){
return true, false, 0, nil, 0, 0, nil
}
predictedRiskScoreInt := int(predictedRiskScore)
return true, true, predictedRiskScoreInt, predictionAccuracyRangesMap, quantityOfLociKnown, quantityOfPhasedLoci, nil
}
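The predictionAccuracyRangesMap returned above maps an accuracy percentage to a +/- distance around the predicted risk score. A minimal standalone sketch of how a caller might turn that map into a bounded interval (the GetRiskScoreRange helper and the 0-10 clamp are illustrative assumptions, not part of this package):

```go
package main

import "fmt"

// GetRiskScoreRange is a hypothetical helper showing how a predicted risk score
// and a confidence ranges map (accuracy percentage -> +/- distance) combine into
// an interval that should contain the true score X% of the time.
func GetRiskScoreRange(predictedScore int, confidenceRangesMap map[int]float64, percentage int)(float64, float64, bool){
	distance, exists := confidenceRangesMap[percentage]
	if (exists == false){
		return 0, 0, false
	}
	lowerBound := float64(predictedScore) - distance
	upperBound := float64(predictedScore) + distance
	// Risk scores are bounded, so we clamp the interval to the 0-10 scale (an assumption for this sketch)
	if (lowerBound < 0){
		lowerBound = 0
	}
	if (upperBound > 10){
		upperBound = 10
	}
	return lowerBound, upperBound, true
}

func main(){
	confidenceRangesMap := map[int]float64{50: 2, 80: 5}
	lower, upper, known := GetRiskScoreRange(7, confidenceRangesMap, 80)
	fmt.Println(lower, upper, known)
}
```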
//Outputs:
// -geneticAnalysis.PersonDiscreteTraitInfo: Trait analysis object
// -error
@ -984,34 +912,78 @@ func GetPersonDiscreteTraitAnalysis(inputGenomesWithMetadataList []prepareRawGen
//Outputs:
// -int: Base pair disease locus risk weight
// -bool: Base pair disease locus odds ratio known
// -float64: Base pair disease locus odds ratio
// -geneticAnalysis.PersonNumericTraitInfo: Trait analysis object
// -error
func GetGenomePolygenicDiseaseLocusRiskInfo(locusRiskWeightsMap map[string]int, locusOddsRatiosMap map[string]float64, locusBase1Value string, locusBase2Value string)(int, bool, float64, error){
func GetPersonNumericTraitAnalysis(inputGenomesWithMetadataList []prepareRawGenomes.GenomeWithMetadata, traitObject traits.Trait)(geneticAnalysis.PersonNumericTraitInfo, error){
locusBasePairJoined := locusBase1Value + ";" + locusBase2Value
// Map Structure: Genome Identifier -> PersonGenomeNumericTraitInfo
newPersonTraitInfoMap := make(map[[16]byte]geneticAnalysis.PersonGenomeNumericTraitInfo)
riskWeight, exists := locusRiskWeightsMap[locusBasePairJoined]
if (exists == false){
// This is an unknown base combination
// We will treat it as a 0 risk weight
return 0, true, 1, nil
for _, genomeWithMetadataObject := range inputGenomesWithMetadataList{
genomeIdentifier := genomeWithMetadataObject.GenomeIdentifier
genomeMap := genomeWithMetadataObject.GenomeMap
neuralNetworkExists, neuralNetworkOutcomeIsKnown, predictedOutcome, predictionConfidenceRangesMap, quantityOfLociKnown, quantityOfPhasedLoci, err := GetGenomeNumericTraitAnalysis(traitObject, genomeMap, true)
if (err != nil) { return geneticAnalysis.PersonNumericTraitInfo{}, err }
if (neuralNetworkExists == false || neuralNetworkOutcomeIsKnown == false){
continue
}
newPersonGenomeTraitInfo := geneticAnalysis.PersonGenomeNumericTraitInfo{
PredictedOutcome: predictedOutcome,
ConfidenceRangesMap: predictionConfidenceRangesMap,
QuantityOfLociKnown: quantityOfLociKnown,
QuantityOfPhasedLoci: quantityOfPhasedLoci,
}
newPersonTraitInfoMap[genomeIdentifier] = newPersonGenomeTraitInfo
}
if (riskWeight == 0){
return 0, true, 1, nil
newPersonTraitInfoObject := geneticAnalysis.PersonNumericTraitInfo{
TraitInfoMap: newPersonTraitInfoMap,
}
oddsRatio, exists := locusOddsRatiosMap[locusBasePairJoined]
if (exists == false){
return riskWeight, false, 0, nil
if (len(newPersonTraitInfoMap) <= 1){
// We do not need to check for conflicts, because there is at most 1 genome with trait information
// Nothing left to do. Analysis is complete.
return newPersonTraitInfoObject, nil
}
return riskWeight, true, oddsRatio, nil
// We check for conflicts
getConflictExistsBool := func()(bool, error){
// We check to see if the analysis results are the same for all genomes
firstItemReached := false
personGenomeTraitInfoObject := geneticAnalysis.PersonGenomeNumericTraitInfo{}
for _, genomeTraitInfoObject := range newPersonTraitInfoMap{
if (firstItemReached == false){
personGenomeTraitInfoObject = genomeTraitInfoObject
firstItemReached = true
continue
}
areEqual := reflect.DeepEqual(personGenomeTraitInfoObject, genomeTraitInfoObject)
if (areEqual == false){
return true, nil
}
}
return false, nil
}
conflictExists, err := getConflictExistsBool()
if (err != nil) { return geneticAnalysis.PersonNumericTraitInfo{}, err }
newPersonTraitInfoObject.ConflictExists = conflictExists
return newPersonTraitInfoObject, nil
}
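The conflict check above compares every genome's analysis result against the first one seen, using reflect.DeepEqual so nested maps are compared by value. A self-contained sketch of the same pattern (genomeTraitInfo is a simplified stand-in for PersonGenomeNumericTraitInfo):

```go
package main

import "fmt"
import "reflect"

// genomeTraitInfo is a simplified stand-in for geneticAnalysis.PersonGenomeNumericTraitInfo
type genomeTraitInfo struct{
	PredictedOutcome float64
	QuantityOfLociKnown int
}

// anyConflictExists reports whether any two genomes produced different analysis results
func anyConflictExists(infoMap map[string]genomeTraitInfo)bool{
	firstItemReached := false
	firstInfo := genomeTraitInfo{}
	for _, info := range infoMap{
		if (firstItemReached == false){
			firstInfo = info
			firstItemReached = true
			continue
		}
		areEqual := reflect.DeepEqual(firstInfo, info)
		if (areEqual == false){
			return true
		}
	}
	return false
}

func main(){
	infoMap := map[string]genomeTraitInfo{
		"genomeA": {PredictedOutcome: 180, QuantityOfLociKnown: 9},
		"genomeB": {PredictedOutcome: 175, QuantityOfLociKnown: 9},
	}
	fmt.Println(anyConflictExists(infoMap))
}
```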
// We use this to generate trait predictions using a neural network
// We use this to generate discrete trait predictions using a neural network
// The alternative prediction method is to use Rules (see GetGenomeTraitAnalysis_Rules)
//Outputs:
// -bool: Trait Neural network analysis available (if false, we can't predict this trait using a neural network)
@ -1056,7 +1028,9 @@ func GetGenomeDiscreteTraitAnalysis_NeuralNetwork(traitObject traits.Trait, geno
traitName := traitObject.TraitName
neuralNetworkModelExists, traitPredictionIsPossible, predictedOutcome, predictionConfidence, quantityOfLociKnown, quantityOfPhasedLoci, err := geneticPrediction.GetNeuralNetworkTraitPredictionFromGenomeMap(traitName, genomeLocusValuesMap)
traitLociList := traitObject.LociList
neuralNetworkModelExists, traitPredictionIsPossible, predictedOutcome, predictionConfidence, quantityOfLociKnown, quantityOfPhasedLoci, err := trainedPredictionModels.GetNeuralNetworkDiscreteTraitPredictionFromGenomeMap(traitName, traitLociList, genomeLocusValuesMap)
if (err != nil) { return false, false, "", 0, 0, 0, err }
if (neuralNetworkModelExists == false){
return false, false, "", 0, 0, 0, nil
@ -1253,6 +1227,65 @@ func GetGenomePassesDiscreteTraitRuleStatus(ruleLociList []traits.RuleLocus, gen
}
// We use this to generate numeric trait predictions using a neural network
//Outputs:
// -bool: Trait Neural network analysis available (if false, we can't predict this trait using a neural network)
// -bool: Neural network outcome is known (at least 1 locus value that the neural network needs is known)
// -float64: The predicted value (Example: Height in centimeters)
// -map[int]float64: Accuracy ranges map
// -Map Structure: Probability prediction is accurate (X) -> Distance from prediction that must be travelled in both directions to
// create a range in which the true value will fall into, X% of the time
// -int: Quantity of loci known
// -int: Quantity of phased loci
// -error
func GetGenomeNumericTraitAnalysis(traitObject traits.Trait, genomeMap map[int64]locusValue.LocusValue, checkForAliases bool)(bool, bool, float64, map[int]float64, int, int, error){
traitLociList := traitObject.LociList
getGenomeLocusValuesMap := func()(map[int64]locusValue.LocusValue, error){
if (checkForAliases == false){
// We don't need to check for rsID aliases.
return genomeMap, nil
}
// This map contains the locus values for the genome
// If a locus's entry doesn't exist, its value is unknown
// Map Structure: Locus rsID -> Locus Value
genomeLocusValuesMap := make(map[int64]locusValue.LocusValue)
for _, locusRSID := range traitLociList{
locusBasePairKnown, _, _, _, locusValueObject, err := GetLocusValueFromGenomeMap(checkForAliases, genomeMap, locusRSID)
if (err != nil) { return nil, err }
if (locusBasePairKnown == false){
continue
}
genomeLocusValuesMap[locusRSID] = locusValueObject
}
return genomeLocusValuesMap, nil
}
genomeLocusValuesMap, err := getGenomeLocusValuesMap()
if (err != nil) { return false, false, 0, nil, 0, 0, err }
traitName := traitObject.TraitName
neuralNetworkModelExists, traitPredictionIsPossible, predictedOutcome, predictionAccuracyRangesMap, quantityOfLociKnown, quantityOfPhasedLoci, err := trainedPredictionModels.GetNeuralNetworkNumericAttributePredictionFromGenomeMap(traitName, traitLociList, genomeLocusValuesMap)
if (err != nil) { return false, false, 0, nil, 0, 0, err }
if (neuralNetworkModelExists == false){
return false, false, 0, nil, 0, 0, nil
}
if (traitPredictionIsPossible == false){
return true, false, 0, nil, 0, 0, nil
}
return true, true, predictedOutcome, predictionAccuracyRangesMap, quantityOfLociKnown, quantityOfPhasedLoci, nil
}
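Both this function and its disease counterpart filter the full genome map down to only the loci the model needs, skipping loci whose values are unknown. That filtering step can be sketched on its own (LocusValue here is a simplified stand-in for locusValue.LocusValue, and the alias lookup is omitted):

```go
package main

import "fmt"

// LocusValue is a simplified stand-in for locusValue.LocusValue
type LocusValue struct{
	Base1 string
	Base2 string
}

// filterGenomeMapToLoci reduces a full genome map to only the loci of interest
// If a locus's entry doesn't exist in the genome map, its value is unknown and it is skipped
func filterGenomeMapToLoci(genomeMap map[int64]LocusValue, lociList []int64)map[int64]LocusValue{
	filteredMap := make(map[int64]LocusValue)
	for _, locusRSID := range lociList{
		locusValueObject, exists := genomeMap[locusRSID]
		if (exists == false){
			continue
		}
		filteredMap[locusRSID] = locusValueObject
	}
	return filteredMap
}

func main(){
	genomeMap := map[int64]LocusValue{1001: {"A", "G"}, 1002: {"C", "C"}}
	filteredMap := filterGenomeMapToLoci(genomeMap, []int64{1001, 9999})
	fmt.Println(len(filteredMap))
}
```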
// This function will retrieve the base pair of the locus from the input genome map
// We use this function because each rsID has aliases, so we must sometimes check those aliases to find locus values
//


@ -8,6 +8,7 @@ import "seekia/resources/geneticReferences/locusMetadata"
import "seekia/resources/geneticReferences/monogenicDiseases"
import "seekia/resources/geneticReferences/polygenicDiseases"
import "seekia/resources/geneticReferences/traits"
import "seekia/resources/trainedPredictionModels"
import "seekia/internal/genetics/createRawGenomes"
import "seekia/internal/genetics/prepareRawGenomes"
@ -25,8 +26,21 @@ func TestCreatePersonGeneticAnalysis_SingleGenome(t *testing.T){
}
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
traits.InitializeTraitVariables()
err = polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = trainedPredictionModels.InitializeTrainedPredictionModels()
if (err != nil) {
t.Fatalf("InitializeTrainedPredictionModels failed: " + err.Error())
}
genomeIdentifier, err := helpers.GetNewRandom16ByteArray()
if (err != nil) {
@ -84,8 +98,21 @@ func TestCreatePersonGeneticAnalysis_MultipleGenomes(t *testing.T){
}
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
traits.InitializeTraitVariables()
err = polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = trainedPredictionModels.InitializeTrainedPredictionModels()
if (err != nil) {
t.Fatalf("InitializeTrainedPredictionModels failed: " + err.Error())
}
numberOfGenomesToAdd := helpers.GetRandomIntWithinRange(2, 5)


@ -4,8 +4,6 @@
package createRawGenomes
import "seekia/resources/geneticReferences/locusMetadata"
import "seekia/internal/genetics/readRawGenomes"
import "seekia/internal/helpers"
import "seekia/internal/unixTime"
@ -25,11 +23,6 @@ import "strings"
// -error
func CreateFakeRawGenome_23andMe()(string, int64, int64, map[int64]readRawGenomes.RawGenomeLocusValue, error){
err := locusMetadata.InitializeLocusMetadataVariables()
if (err != nil){
return "", 0, 0, nil, errors.New("InitializeLocusMetadataVariables failed: " + err.Error())
}
yearUnix := unixTime.GetYearUnix()
maximumTime := time.Now().Unix()
@ -99,7 +92,7 @@ func CreateFakeRawGenome_23andMe()(string, int64, int64, map[int64]readRawGenome
# rsid chromosome position genotype
`
_, err = fileContentsBuilder.WriteString(fileHeader)
_, err := fileContentsBuilder.WriteString(fileHeader)
if (err != nil){
return "", 0, 0, nil, errors.New("Failed to WriteString to string builder: " + err.Error())
}
@ -215,11 +208,6 @@ func CreateFakeRawGenome_23andMe()(string, int64, int64, map[int64]readRawGenome
// -error
func CreateFakeRawGenome_AncestryDNA()(string, int64, int64, map[int64]readRawGenomes.RawGenomeLocusValue, error){
err := locusMetadata.InitializeLocusMetadataVariables()
if (err != nil){
return "", 0, 0, nil, errors.New("InitializeLocusMetadataVariables failed: " + err.Error())
}
yearUnix := unixTime.GetYearUnix()
maximumTime := time.Now().Unix()
@ -282,7 +270,7 @@ func CreateFakeRawGenome_AncestryDNA()(string, int64, int64, map[int64]readRawGe
rsid chromosome position allele1 allele2
`
_, err = fileContentsBuilder.WriteString(fileHeader)
_, err := fileContentsBuilder.WriteString(fileHeader)
if (err != nil){
return "", 0, 0, nil, errors.New("Failed to WriteString to string builder: " + err.Error())
}


@ -107,32 +107,21 @@ type PersonPolygenicDiseaseInfo struct{
type PersonGenomePolygenicDiseaseInfo struct{
// This describes the quantity of loci tested for this disease
// This should be len(LociInfoMap)
QuantityOfLociTested int
// This is total risk score for this disease for the person's genome
// This is a number between 1-10
RiskScore int
// This map contains info about all tested polygenic disease loci for this genome
// If a locus does not exist in the map, its values are unknown
// Map Structure: Locus Identifier -> PersonGenomePolygenicDiseaseLocusInfo
LociInfoMap map[[3]byte]PersonGenomePolygenicDiseaseLocusInfo
}
// This map stores the confidence ranges for the predicted risk score
// If we want to know how accurate the prediction is with a X% accuracy, how far would we have to expand the
// risk score's range to be accurate, X% of the time?
// For example: 50% accuracy requires a +/-2 point range, 80% accuracy requires a +/-5 point range
// Map Structure: Accuracy probability (0-100) -> Amount to add to value in both +/- directions so prediction is that accurate
ConfidenceRangesMap map[int]float64
type PersonGenomePolygenicDiseaseLocusInfo struct{
// This describes the quantity of loci known for this disease
QuantityOfLociKnown int
// This is the risk weight that this person's genome has for this variant
// A higher risk weight means more risk of getting the disease
RiskWeight int
// This is false if the odds ratio is not known
OddsRatioIsKnown bool
// This is the person's genome odds ratio value for this variant's locus
// A ratio >1 means their risk is increased, a ratio <1 means their risk is decreased
OddsRatio float64
QuantityOfPhasedLoci int
}
@ -325,7 +314,6 @@ type OffspringMonogenicDiseaseVariantInfo struct{
ProbabilityOf2MutationsUpperBound int
}
type OffspringPolygenicDiseaseInfo struct{
// This map stores the polygenic disease info for each genome pair
@ -339,16 +327,23 @@ type OffspringPolygenicDiseaseInfo struct{
type OffspringGenomePairPolygenicDiseaseInfo struct{
// This should be len(LociInfoMap)
QuantityOfLociTested int
// A number between 1-10 representing the offspring's average risk score
// 1 == lowest risk, 10 == highest risk
OffspringAverageRiskScore int
// A map of the offspring's locus information
// Map Structure: Locus Identifier -> OffspringPolygenicDiseaseLocusInfo
LociInfoMap map[[3]byte]OffspringPolygenicDiseaseLocusInfo
// This map stores the confidence ranges for the predicted risk score
// If we want to know how accurate the prediction is with a X% accuracy, how far would we have to expand the
// risk score's range to be accurate, X% of the time?
// For example: 50% accuracy requires a +/-2 point range, 80% accuracy requires a +/-3 point range
// Map Structure: Accuracy probability (0-100) -> Amount to add to value in both +/- directions so prediction is that accurate
PredictionConfidenceRangesMap map[int]float64
QuantityOfLociKnown int
// This describes the quantity of loci from both parents that are phased
// For example, if there are 10 loci for this trait, and one parent has 10 phased loci and the other has 5,
// this variable will have a value of 15
QuantityOfParentalPhasedLoci int
// This is a list of prospective offspring risk scores
// This is useful for plotting on a graph to understand the standard deviation of risk
@ -356,27 +351,6 @@ type OffspringGenomePairPolygenicDiseaseInfo struct{
}
type OffspringPolygenicDiseaseLocusInfo struct{
// This is the offspring's average risk weight for this locus value
// A higher weight means a higher risk of the disease
OffspringAverageRiskWeight int
// This is true if any of the 100 prospective offspring had a known odds ratio for this locus
OffspringOddsRatioIsKnown bool
// This value represents the offspring's average odds ratio for the disease locus
// A value <1 denotes a lesser risk, a value >1 denotes an increased risk
OffspringAverageOddsRatio float64
// This is the average of the sum of weights for the loci which have no odds ratios for each prospective offspring
// We do this to understand what effect those loci are having on the odds ratio
// If the sum is <0, we say the ratio is probably lower
// If the sum is >0, we say the ratio is probably higher
OffspringAverageUnknownOddsRatiosWeightSum int
}
type OffspringDiscreteTraitInfo struct{
// This map stores the trait info for each genome pair
@ -469,15 +443,15 @@ type OffspringGenomePairNumericTraitInfo struct{
// predicted value's range to be accurate, X% of the time?
// For example: 50% accuracy requires a +/-5 point range, 80% accuracy requires a +/-15 point range
// Map Structure: Accuracy probability (0-100) -> Amount to add to value in both +/- directions so prediction is that accurate
AverageConfidenceRangesMap map[int]float64
PredictionConfidenceRangesMap map[int]float64
QuantityOfLociKnown int
// This describes the quantity of loci from both parents that are phased
// For example, if there are 10 loci for this trait, and one parent has 10 phased loci and the other has 5,
// this variable will have a value of 15
QuantityOfParentalPhasedLoci int
QuantityOfLociKnown int
// A list of 100 offspring outcomes for 100 prospective offspring from the genome pair
// Example: A list of heights for 100 prospective offspring
SampleOffspringOutcomesList []float64

File diff suppressed because it is too large


@ -2,6 +2,7 @@ package geneticPrediction_test
import "seekia/internal/genetics/geneticPrediction"
import "seekia/internal/genetics/geneticPredictionModels"
import "testing"
@ -14,12 +15,12 @@ func TestEncodeNeuralNetwork(t *testing.T){
t.Fatalf("GetNewUntrainedNeuralNetworkObject failed: " + err.Error())
}
neuralNetworkBytes, err := geneticPrediction.EncodeNeuralNetworkObjectToBytes(*neuralNetworkObject)
neuralNetworkBytes, err := geneticPredictionModels.EncodeNeuralNetworkObjectToBytes(*neuralNetworkObject)
if (err != nil){
t.Fatalf("EncodeNeuralNetworkObjectToBytes failed: " + err.Error())
}
_, err = geneticPrediction.DecodeBytesToNeuralNetworkObject(neuralNetworkBytes)
_, err = geneticPredictionModels.DecodeBytesToNeuralNetworkObject(neuralNetworkBytes)
if (err != nil){
t.Fatalf("DecodeBytesToNeuralNetworkObject failed: " + err.Error())
}


@ -0,0 +1,230 @@
// geneticPredictionModels provides the data structures and functions to represent, encode, and decode genetic prediction models
// Prediction models are used to predict polygenic disease risk scores and trait outcomes
package geneticPredictionModels
import "gorgonia.org/gorgonia"
import "gorgonia.org/tensor"
import "bytes"
import "encoding/gob"
import "errors"
type NeuralNetwork struct{
// ExprGraph is a data structure for a directed acyclic graph (of expressions).
Graph *gorgonia.ExprGraph
// These are the weights for each layer of neurons
Weights1 *gorgonia.Node
Weights2 *gorgonia.Node
Weights3 *gorgonia.Node
// This is the computed prediction
Prediction *gorgonia.Node
}
// This function returns the weights of the neural network
// We need this for training
func (inputNetwork *NeuralNetwork)GetLearnables()gorgonia.Nodes{
weights1 := inputNetwork.Weights1
weights2 := inputNetwork.Weights2
weights3 := inputNetwork.Weights3
result := gorgonia.Nodes{weights1, weights2, weights3}
return result
}
// We use this to store a neural network's weights as a .gob file
type neuralNetworkForEncoding struct{
// These are the weights for each layer of neurons
Weights1 []float32
Weights2 []float32
Weights3 []float32
// These represent the quantity of rows and columns for each weight layer
Weights1Rows int
Weights1Columns int
Weights2Rows int
Weights2Columns int
Weights3Rows int
Weights3Columns int
}
func EncodeNeuralNetworkObjectToBytes(inputNeuralNetwork NeuralNetwork)([]byte, error){
weights1 := inputNeuralNetwork.Weights1
weights2 := inputNeuralNetwork.Weights2
weights3 := inputNeuralNetwork.Weights3
weights1Slice := weights1.Value().Data().([]float32)
weights2Slice := weights2.Value().Data().([]float32)
weights3Slice := weights3.Value().Data().([]float32)
weights1Rows := weights1.Shape()[0]
weights1Columns := weights1.Shape()[1]
weights2Rows := weights2.Shape()[0]
weights2Columns := weights2.Shape()[1]
weights3Rows := weights3.Shape()[0]
weights3Columns := weights3.Shape()[1]
newNeuralNetworkForEncoding := neuralNetworkForEncoding{
Weights1: weights1Slice,
Weights2: weights2Slice,
Weights3: weights3Slice,
Weights1Rows: weights1Rows,
Weights1Columns: weights1Columns,
Weights2Rows: weights2Rows,
Weights2Columns: weights2Columns,
Weights3Rows: weights3Rows,
Weights3Columns: weights3Columns,
}
buffer := new(bytes.Buffer)
encoder := gob.NewEncoder(buffer)
err := encoder.Encode(newNeuralNetworkForEncoding)
if (err != nil) { return nil, err }
neuralNetworkBytes := buffer.Bytes()
return neuralNetworkBytes, nil
}
func DecodeBytesToNeuralNetworkObject(inputNeuralNetwork []byte)(NeuralNetwork, error){
if (inputNeuralNetwork == nil){
return NeuralNetwork{}, errors.New("DecodeBytesToNeuralNetworkObject called with nil inputNeuralNetwork.")
}
buffer := bytes.NewBuffer(inputNeuralNetwork)
decoder := gob.NewDecoder(buffer)
var newNeuralNetworkForEncoding neuralNetworkForEncoding
err := decoder.Decode(&newNeuralNetworkForEncoding)
if (err != nil){ return NeuralNetwork{}, err }
weights1 := newNeuralNetworkForEncoding.Weights1
weights2 := newNeuralNetworkForEncoding.Weights2
weights3 := newNeuralNetworkForEncoding.Weights3
weights1Rows := newNeuralNetworkForEncoding.Weights1Rows
weights1Columns := newNeuralNetworkForEncoding.Weights1Columns
weights2Rows := newNeuralNetworkForEncoding.Weights2Rows
weights2Columns := newNeuralNetworkForEncoding.Weights2Columns
weights3Rows := newNeuralNetworkForEncoding.Weights3Rows
weights3Columns := newNeuralNetworkForEncoding.Weights3Columns
// This is the graph object we add each layer to
newGraph := gorgonia.NewGraph()
// A layer is a column of neurons
// Each neuron has an initial value between 0 and 1
getNewNeuralNetworkLayerWeights := func(layerName string, layerNeuronRows int, layerNeuronColumns int, layerWeightsList []float32)*gorgonia.Node{
layerNameObject := gorgonia.WithName(layerName)
layerBacking := tensor.WithBacking(layerWeightsList)
layerShape := tensor.WithShape(layerNeuronRows, layerNeuronColumns)
layerTensor := tensor.New(layerBacking, layerShape)
layerValueObject := gorgonia.WithValue(layerTensor)
layerObject := gorgonia.NewMatrix(newGraph, tensor.Float32, layerNameObject, layerValueObject)
return layerObject
}
layer1 := getNewNeuralNetworkLayerWeights("Weights1", weights1Rows, weights1Columns, weights1)
layer2 := getNewNeuralNetworkLayerWeights("Weights2", weights2Rows, weights2Columns, weights2)
layer3 := getNewNeuralNetworkLayerWeights("Weights3", weights3Rows, weights3Columns, weights3)
newNeuralNetworkObject := NeuralNetwork{
Graph: newGraph,
Weights1: layer1,
Weights2: layer2,
Weights3: layer3,
}
return newNeuralNetworkObject, nil
}
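The encode/decode pair above flattens each gorgonia weight tensor into a plain []float32 plus its shape so encoding/gob can serialize it without knowing about gorgonia types. The round-trip pattern can be sketched with a simplified struct (weightsForEncoding is an illustrative stand-in for neuralNetworkForEncoding):

```go
package main

import "bytes"
import "encoding/gob"
import "fmt"

// weightsForEncoding mirrors the flatten-then-gob pattern:
// a weight matrix is stored as a flat slice plus its row/column counts
type weightsForEncoding struct{
	Weights []float32
	Rows int
	Columns int
}

func encodeWeights(input weightsForEncoding)([]byte, error){
	buffer := new(bytes.Buffer)
	encoder := gob.NewEncoder(buffer)
	err := encoder.Encode(input)
	if (err != nil) { return nil, err }
	return buffer.Bytes(), nil
}

func decodeWeights(input []byte)(weightsForEncoding, error){
	buffer := bytes.NewBuffer(input)
	decoder := gob.NewDecoder(buffer)
	var decoded weightsForEncoding
	err := decoder.Decode(&decoded)
	if (err != nil) { return weightsForEncoding{}, err }
	return decoded, nil
}

func main(){
	original := weightsForEncoding{Weights: []float32{0.1, 0.2}, Rows: 1, Columns: 2}
	encoded, _ := encodeWeights(original)
	decoded, _ := decodeWeights(encoded)
	fmt.Println(decoded.Rows, decoded.Columns, len(decoded.Weights))
}
```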
// This function will take a neural network and input layer and build the network to be able to compute a prediction
// We need to run a virtual machine after calling this function in order for the prediction to be generated
func (inputNetwork *NeuralNetwork)BuildNeuralNetwork(inputLayer *gorgonia.Node, predictionIsNumeric bool)error{
// We copy the node pointer (a resource I'm reading says to do this)
inputLayerCopy := inputLayer
// We multiply weights at each layer and perform ReLU (Rectification) after each multiplication
weights1 := inputNetwork.Weights1
layer1Product, err := gorgonia.Mul(inputLayerCopy, weights1)
if (err != nil) {
return errors.New("Layer 1 multiplication failed: " + err.Error())
}
layer1ProductRectified, err := gorgonia.Rectify(layer1Product)
if (err != nil){
return errors.New("Layer 1 Rectify failed: " + err.Error())
}
weights2 := inputNetwork.Weights2
layer2Product, err := gorgonia.Mul(layer1ProductRectified, weights2)
if (err != nil) {
return errors.New("Layer 2 multiplication failed: " + err.Error())
}
layer2ProductRectified, err := gorgonia.Rectify(layer2Product)
if (err != nil){
return errors.New("Layer 2 Rectify failed: " + err.Error())
}
weights3 := inputNetwork.Weights3
layer3Product, err := gorgonia.Mul(layer2ProductRectified, weights3)
if (err != nil) {
return errors.New("Layer 3 multiplication failed: " + err.Error())
}
if (predictionIsNumeric == false){
// We SoftMax the output to get the prediction
prediction, err := gorgonia.SoftMax(layer3Product)
if (err != nil) {
return errors.New("SoftMax failed: " + err.Error())
}
inputNetwork.Prediction = prediction
} else {
// We Sigmoid the output to get the prediction
prediction, err := gorgonia.Sigmoid(layer3Product)
if (err != nil) {
return errors.New("Sigmoid failed: " + err.Error())
}
inputNetwork.Prediction = prediction
}
return nil
}
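The layer structure above is a chain of matrix multiplications with rectification in between. A plain-Go sketch of one such step, without gorgonia, to make the shapes concrete (relu and matVec are illustrative helpers, not part of the package):

```go
package main

import "fmt"

// relu applies the rectification used between the layers
func relu(input []float32)[]float32{
	output := make([]float32, len(input))
	for index, value := range input{
		if (value > 0){
			output[index] = value
		}
	}
	return output
}

// matVec multiplies a row vector (the input layer) by a weights matrix (rows x columns),
// mirroring the gorgonia.Mul calls in BuildNeuralNetwork
func matVec(vector []float32, weights [][]float32)[]float32{
	columns := len(weights[0])
	output := make([]float32, columns)
	for columnIndex := 0; columnIndex < columns; columnIndex++{
		for rowIndex := 0; rowIndex < len(vector); rowIndex++{
			output[columnIndex] += vector[rowIndex] * weights[rowIndex][columnIndex]
		}
	}
	return output
}

func main(){
	inputLayer := []float32{1, -2}
	weights1 := [][]float32{{0.5, -1}, {-0.25, 0.75}}
	hiddenLayer := relu(matVec(inputLayer, weights1))
	fmt.Println(hiddenLayer)
}
```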


@ -627,7 +627,7 @@ func StartCreateNewPersonGeneticAnalysis(personIdentifier string)(string, error)
for index, genomeMap := range personGenomesMapList{
newPercentageProgress, err := helpers.ScaleNumberProportionally(true, index, 0, finalIndex, 0, 10)
newPercentageProgress, err := helpers.ScaleIntProportionally(true, index, 0, finalIndex, 0, 10)
if (err != nil) { return err }
err = updatePercentageCompleteFunction(newPercentageProgress)
@ -665,7 +665,7 @@ func StartCreateNewPersonGeneticAnalysis(personIdentifier string)(string, error)
analysisUpdatePercentageCompleteFunction := func(inputProgress int)error{
newPercentageProgress, err := helpers.ScaleNumberProportionally(true, inputProgress, 0, 100, 10, 10)
newPercentageProgress, err := helpers.ScaleIntProportionally(true, inputProgress, 0, 100, 10, 10)
if (err != nil) { return err }
err = updatePercentageCompleteFunction(newPercentageProgress)
@ -899,7 +899,7 @@ func StartCreateNewCoupleGeneticAnalysis(inputPerson1Identifier string, inputPer
break
}
personPercentageComplete, err := helpers.ScaleNumberProportionally(true, processPercentageComplete, 0, 100, personPercentageRangeStart, personPercentageRangeEnd)
personPercentageComplete, err := helpers.ScaleIntProportionally(true, processPercentageComplete, 0, 100, personPercentageRangeStart, personPercentageRangeEnd)
if (err != nil) { return err }
err = updatePercentageCompleteFunction(personPercentageComplete)
@ -978,7 +978,7 @@ func StartCreateNewCoupleGeneticAnalysis(inputPerson1Identifier string, inputPer
updateCoupleAnalysisPercentageCompleteFunction := func(newPercentage int)error{
personPercentageComplete, err := helpers.ScaleNumberProportionally(true, newPercentage, 0, 100, 74, 100)
personPercentageComplete, err := helpers.ScaleIntProportionally(true, newPercentage, 0, 100, 74, 100)
if (err != nil) { return err }
err = updatePercentageCompleteFunction(personPercentageComplete)


@ -68,16 +68,17 @@ func CreateRawGenomeWithMetadataObject(genomeIdentifier [16]byte, rawGenomeStrin
// -[]RawGenomeWithMetadata
// -func(int)error: Update Percentage Complete Function
//Outputs:
// -bool: Any useful locations exist in any of the provided genomes
// -[]GenomeWithMetadata: Genomes with metadata list
// -[][16]byte: All raw genome identifiers list (not including combined genomes)
// -bool: Combined genomes exist
// -[16]byte: Only exclude conflicts genome identifier
// -[16]byte: Only include shared genome identifier
// -error
func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWithMetadata, updatePercentageCompleteFunction func(int)error)([]GenomeWithMetadata, [][16]byte, bool, [16]byte, [16]byte, error){
func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWithMetadata, updatePercentageCompleteFunction func(int)error)(bool, []GenomeWithMetadata, [][16]byte, bool, [16]byte, [16]byte, error){
if (len(inputGenomesList) == 0){
return nil, nil, false, [16]byte{}, [16]byte{}, errors.New("GetGenomesWithMetadataListFromRawGenomesList called with empty inputGenomesList")
return false, nil, nil, false, [16]byte{}, [16]byte{}, errors.New("GetGenomesWithMetadataListFromRawGenomesList called with empty inputGenomesList")
}
// The reading of genomes will take up the first 20% of the percentage range
@ -87,18 +88,17 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
// Each map stores a genome from a company or a combined genome.
genomesWithMetadataList := make([]GenomeWithMetadata, 0)
numberOfGenomesRead := 0
totalNumberOfGenomesToRead := len(inputGenomesList)
finalIndex := len(inputGenomesList) - 1
allRawGenomeIdentifiersList := make([][16]byte, 0)
for _, rawGenomeWithMetadataObject := range inputGenomesList{
for index, rawGenomeWithMetadataObject := range inputGenomesList{
newPercentageCompletion, err := helpers.ScaleNumberProportionally(true, numberOfGenomesRead, 0, totalNumberOfGenomesToRead, 0, 20)
if (err != nil) { return nil, nil, false, [16]byte{}, [16]byte{}, err }
newPercentageCompletion, err := helpers.ScaleIntProportionally(true, index, 0, finalIndex, 0, 20)
if (err != nil) { return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
err = updatePercentageCompleteFunction(newPercentageCompletion)
if (err != nil) { return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil) { return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
genomeIdentifier := rawGenomeWithMetadataObject.GenomeIdentifier
genomeIsPhased := rawGenomeWithMetadataObject.GenomeIsPhased
@@ -107,12 +107,11 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
// Now we convert rawGenomeMap to a genomeMap
anyValuesExist, genomeMap, err := ConvertRawGenomeToGenomeMap(rawGenomeMap, genomeIsPhased)
if (err != nil) { return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil) { return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
if (anyValuesExist == false){
// We have to make sure this never happens so the user isn't confused as to why genomes
// that were imported were not included in the analysis
// We make sure this doesn't happen by verifying the genome at the time of importing
return nil, nil, false, [16]byte{}, [16]byte{}, errors.New("Genome supplied to GetGenomesWithMetadataListFromRawGenomesList has no valid locations.")
// This genome is not useful
// No useful locations exist
continue
}
genomeWithMetadataObject := GenomeWithMetadata{
@@ -123,17 +122,20 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
genomesWithMetadataList = append(genomesWithMetadataList, genomeWithMetadataObject)
allRawGenomeIdentifiersList = append(allRawGenomeIdentifiersList, genomeIdentifier)
}
numberOfGenomesRead += 1
if (len(genomesWithMetadataList) == 0){
// None of the provided genomes contained any useful locations
return false, nil, nil, false, [16]byte{}, [16]byte{}, nil
}
containsDuplicates, _ := helpers.CheckIfListContainsDuplicates(allRawGenomeIdentifiersList)
if (containsDuplicates == true){
return nil, nil, false, [16]byte{}, [16]byte{}, errors.New("GetGenomesWithMetadataListFromRawGenomesList called with inputGenomesList containing duplicate genomeIdentifiers.")
return false, nil, nil, false, [16]byte{}, [16]byte{}, errors.New("GetGenomesWithMetadataListFromRawGenomesList called with inputGenomesList containing duplicate genomeIdentifiers.")
}
err := updatePercentageCompleteFunction(20)
if (err != nil){ return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil){ return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
if (len(genomesWithMetadataList) <= 1){
@@ -141,9 +143,9 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
// No genome combining is needed.
err = updatePercentageCompleteFunction(100)
if (err != nil){ return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil){ return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
return genomesWithMetadataList, allRawGenomeIdentifiersList, false, [16]byte{}, [16]byte{}, nil
return true, genomesWithMetadataList, allRawGenomeIdentifiersList, false, [16]byte{}, [16]byte{}, nil
}
// Now we create the shared genomes
@@ -156,15 +158,15 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
// This map stores all RSIDs across all genomes
allRSIDsMap := make(map[int64]struct{})
finalIndex := len(genomesWithMetadataList) - 1
finalIndex = len(genomesWithMetadataList) - 1
for index, genomeWithMetadataObject := range genomesWithMetadataList{
newPercentageCompletion, err := helpers.ScaleNumberProportionally(true, index, 0, finalIndex, 20, 50)
if (err != nil){ return nil, nil, false, [16]byte{}, [16]byte{}, err }
newPercentageCompletion, err := helpers.ScaleIntProportionally(true, index, 0, finalIndex, 20, 50)
if (err != nil){ return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
err = updatePercentageCompleteFunction(newPercentageCompletion)
if (err != nil){ return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil){ return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
genomeMap := genomeWithMetadataObject.GenomeMap
@@ -185,11 +187,11 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
for rsID, _ := range allRSIDsMap{
newPercentageCompletion, err := helpers.ScaleNumberProportionally(true, index, 0, finalIndex, 50, 100)
if (err != nil){ return nil, nil, false, [16]byte{}, [16]byte{}, err }
newPercentageCompletion, err := helpers.ScaleIntProportionally(true, index, 0, finalIndex, 50, 100)
if (err != nil){ return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
err = updatePercentageCompleteFunction(newPercentageCompletion)
if (err != nil){ return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil){ return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
index += 1
@@ -200,7 +202,7 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
}
anyAliasesExist, rsidAliasesList, err := locusMetadata.GetRSIDAliases(rsID)
if (err != nil){ return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil){ return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
if (anyAliasesExist == true){
for _, rsidAlias := range rsidAliasesList{
@@ -386,7 +388,7 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
}
locusBase1, locusBase2, phaseIsKnown_OnlyExcludeConflicts, phaseIsKnown_OnlyIncludeShared, err := getLocusBasePair()
if (err != nil){ return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil){ return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
// Now we add to the combined genome maps
// The OnlyExcludeConflicts will only omit when there is a tie
@@ -436,7 +438,7 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
}
onlyExcludeConflictsGenomeIdentifier, err := helpers.GetNewRandom16ByteArray()
if (err != nil) { return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil) { return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
onlyExcludeConflictsGenomeWithMetadataObject := GenomeWithMetadata{
GenomeType: "OnlyExcludeConflicts",
@@ -445,7 +447,7 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
}
onlyIncludeSharedGenomeIdentifier, err := helpers.GetNewRandom16ByteArray()
if (err != nil) { return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil) { return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
onlyIncludeSharedGenomeWithMetadataObject := GenomeWithMetadata{
GenomeType: "OnlyIncludeShared",
@@ -456,9 +458,9 @@ func GetGenomesWithMetadataListFromRawGenomesList(inputGenomesList []RawGenomeWi
genomesWithMetadataList = append(genomesWithMetadataList, onlyExcludeConflictsGenomeWithMetadataObject, onlyIncludeSharedGenomeWithMetadataObject)
err = updatePercentageCompleteFunction(100)
if (err != nil){ return nil, nil, false, [16]byte{}, [16]byte{}, err }
if (err != nil){ return false, nil, nil, false, [16]byte{}, [16]byte{}, err }
return genomesWithMetadataList, allRawGenomeIdentifiersList, true, onlyExcludeConflictsGenomeIdentifier, onlyIncludeSharedGenomeIdentifier, nil
return true, genomesWithMetadataList, allRawGenomeIdentifiersList, true, onlyExcludeConflictsGenomeIdentifier, onlyIncludeSharedGenomeIdentifier, nil
}
//Outputs:

File diff suppressed because it is too large


@@ -483,193 +483,74 @@ func GetOffspringMonogenicDiseaseVariantInfoFromGeneticAnalysis(coupleAnalysisOb
return true, probabilityOf0MutationsLowerBound, probabilityOf0MutationsUpperBound, probabilityOf0MutationsFormatted, probabilityOf1MutationLowerBound, probabilityOf1MutationUpperBound, probabilityOf1MutationFormatted, probabilityOf2MutationsLowerBound, probabilityOf2MutationsUpperBound, probabilityOf2MutationsFormatted, nil
}
//Outputs:
// -bool: Polygenic Disease Risk Score known (any loci values exist)
// -int: Person Disease risk score
// -string: Person Disease risk score formatted (has "/10" suffix)
// -int: Quantity of loci tested
// -bool: Conflict exists
// -bool: Any analysis exists
// -int: Predicted risk score (0-10)
// -map[int]float64: Prediction confidence ranges map
// -Map Structure: Percentage probability of accurate prediction -> distance of range in both directions from prediction
// -int: Quantity of loci known
// -int: Quantity of phased loci
// -bool: Conflict exists (between any of these results for each genome)
// -error
func GetPersonPolygenicDiseaseInfoFromGeneticAnalysis(personAnalysisObject geneticAnalysis.PersonAnalysis, diseaseName string, genomeIdentifier [16]byte)(bool, int, string, int, bool, error){
func GetPersonPolygenicDiseaseInfoFromGeneticAnalysis(personAnalysisObject geneticAnalysis.PersonAnalysis, diseaseName string, genomeIdentifier [16]byte)(bool, int, map[int]float64, int, int, bool, error){
personPolygenicDiseasesMap := personAnalysisObject.PolygenicDiseasesMap
personPolygenicDiseaseInfo, exists := personPolygenicDiseasesMap[diseaseName]
personDiseaseInfoObject, exists := personPolygenicDiseasesMap[diseaseName]
if (exists == false){
return false, 0, "", 0, false, nil
return false, 0, nil, 0, 0, false, nil
}
personPolygenicDiseaseInfoMap := personPolygenicDiseaseInfo.PolygenicDiseaseInfoMap
personDiseaseInfoMap := personDiseaseInfoObject.PolygenicDiseaseInfoMap
conflictExists := personDiseaseInfoObject.ConflictExists
genomePolygenicDiseaseInfo, exists := personPolygenicDiseaseInfoMap[genomeIdentifier]
personGenomeDiseaseInfoObject, exists := personDiseaseInfoMap[genomeIdentifier]
if (exists == false){
return false, 0, "", 0, false, nil
return false, 0, nil, 0, 0, false, nil
}
conflictExists := personPolygenicDiseaseInfo.ConflictExists
predictedRiskScore := personGenomeDiseaseInfoObject.RiskScore
confidenceRangesMap := personGenomeDiseaseInfoObject.ConfidenceRangesMap
quantityOfLociKnown := personGenomeDiseaseInfoObject.QuantityOfLociKnown
quantityOfPhasedLoci := personGenomeDiseaseInfoObject.QuantityOfPhasedLoci
personDiseaseRiskScore := genomePolygenicDiseaseInfo.RiskScore
personDiseaseRiskScoreString := helpers.ConvertIntToString(personDiseaseRiskScore)
personDiseaseRiskScoreFormatted := personDiseaseRiskScoreString + "/10"
quantityOfLociTested := genomePolygenicDiseaseInfo.QuantityOfLociTested
return true, personDiseaseRiskScore, personDiseaseRiskScoreFormatted, quantityOfLociTested, conflictExists, nil
return true, predictedRiskScore, confidenceRangesMap, quantityOfLociKnown, quantityOfPhasedLoci, conflictExists, nil
}
//Outputs:
// -bool: Offspring Disease Risk Score known
// -int: Offspring average disease risk score
// -string: Offspring Disease average risk score formatted (has "/10" suffix)
// -[]int: Sample Offspring Risk Scores List
// -int: Quantity of loci tested
// -bool: Conflict exists
// -bool: Analysis exists
// -int: Average offspring risk score (0-10)
// -map[int]float64: Prediction confidence ranges map
// -int: Quantity of loci known
// -int: Quantity of Parental phased loci
// -[]int: 100 Sample offspring risk scores
// -bool: Conflict exists (Between this genome pair and other genome pairs)
// -error
func GetOffspringPolygenicDiseaseInfoFromGeneticAnalysis(coupleAnalysisObject geneticAnalysis.CoupleAnalysis, diseaseName string, genomePairIdentifier [32]byte)(bool, int, string, []int, int, bool, error){
func GetOffspringPolygenicDiseaseInfoFromGeneticAnalysis(coupleAnalysisObject geneticAnalysis.CoupleAnalysis, diseaseName string, genomePairIdentifier [32]byte)(bool, int, map[int]float64, int, int, []int, bool, error){
couplePolygenicDiseasesMap := coupleAnalysisObject.PolygenicDiseasesMap
offspringDiseasesMap := coupleAnalysisObject.PolygenicDiseasesMap
couplePolygenicDiseaseInfo, exists := couplePolygenicDiseasesMap[diseaseName]
diseaseInfoObject, exists := offspringDiseasesMap[diseaseName]
if (exists == false){
return false, 0, "", nil, 0, false, nil
return false, 0, nil, 0, 0, nil, false, nil
}
polygenicDiseaseInfoMap := couplePolygenicDiseaseInfo.PolygenicDiseaseInfoMap
diseaseInfoMap := diseaseInfoObject.PolygenicDiseaseInfoMap
conflictExists := diseaseInfoObject.ConflictExists
genomePairPolygenicDiseaseInfo, exists := polygenicDiseaseInfoMap[genomePairIdentifier]
genomePairDiseaseInfoObject, exists := diseaseInfoMap[genomePairIdentifier]
if (exists == false){
return false, 0, "", nil, 0, false, nil
return false, 0, nil, 0, 0, nil, false, nil
}
conflictExists := couplePolygenicDiseaseInfo.ConflictExists
offspringAverageRiskScore := genomePairDiseaseInfoObject.OffspringAverageRiskScore
predictionConfidenceRangesMap := genomePairDiseaseInfoObject.PredictionConfidenceRangesMap
quantityOfLociKnown := genomePairDiseaseInfoObject.QuantityOfLociKnown
quantityOfParentalPhasedLoci := genomePairDiseaseInfoObject.QuantityOfParentalPhasedLoci
sampleOffspringRiskScoresList := genomePairDiseaseInfoObject.SampleOffspringRiskScoresList
quantityOfLociTested := genomePairPolygenicDiseaseInfo.QuantityOfLociTested
offspringAverageRiskScore := genomePairPolygenicDiseaseInfo.OffspringAverageRiskScore
offspringAverageRiskScoreString := helpers.ConvertIntToString(offspringAverageRiskScore)
offspringAverageRiskScoreFormatted := offspringAverageRiskScoreString + "/10"
sampleOffspringRiskScoresList := genomePairPolygenicDiseaseInfo.SampleOffspringRiskScoresList
return true, offspringAverageRiskScore, offspringAverageRiskScoreFormatted, sampleOffspringRiskScoresList, quantityOfLociTested, conflictExists, nil
}
//Outputs:
// -bool: Risk Weight and base pair known
// -int: Locus risk weight
// -bool: Locus odds ratio known
// -float64: Locus odds ratio
// -string: Locus odds ratio formatted (with x suffix)
// -error
func GetPersonPolygenicDiseaseLocusInfoFromGeneticAnalysis(personAnalyisObject geneticAnalysis.PersonAnalysis, diseaseName string, locusIdentifier [3]byte, genomeIdentifier [16]byte)(bool, int, bool, float64, string, error){
personPolygenicDiseasesMap := personAnalyisObject.PolygenicDiseasesMap
personPolygenicDiseaseMap, exists := personPolygenicDiseasesMap[diseaseName]
if (exists == false){
return false, 0, false, 0, "", nil
}
personPolygenicDiseaseInfoMap := personPolygenicDiseaseMap.PolygenicDiseaseInfoMap
personGenomePolygenicDiseaseInfo, exists := personPolygenicDiseaseInfoMap[genomeIdentifier]
if (exists == false){
return false, 0, false, 0, "", nil
}
genomeLociInfoMap := personGenomePolygenicDiseaseInfo.LociInfoMap
locusInfoObject, exists := genomeLociInfoMap[locusIdentifier]
if (exists == false){
return false, 0, false, 0, "", nil
}
locusRiskWeight := locusInfoObject.RiskWeight
locusOddsRatioIsKnown := locusInfoObject.OddsRatioIsKnown
if (locusOddsRatioIsKnown == false){
return true, locusRiskWeight, false, 0, "", nil
}
locusOddsRatio := locusInfoObject.OddsRatio
genomeLocusOddsRatioString := helpers.ConvertFloat64ToStringRounded(locusOddsRatio, 2)
locusOddsRatioFormatted := genomeLocusOddsRatioString + "x"
return true, locusRiskWeight, true, locusOddsRatio, locusOddsRatioFormatted, nil
}
//Outputs:
// -bool: Offspring risk weight known
// -int: Offspring risk weight
// -bool: Offspring odds ratio known
// -float64: Offspring odds ratio
// -string: Offspring odds ratio formatted (with + and < from unknownFactors weight sum and x suffix)
// -error
func GetOffspringPolygenicDiseaseLocusInfoFromGeneticAnalysis(coupleAnalysisObject geneticAnalysis.CoupleAnalysis, diseaseName string, locusIdentifier [3]byte, genomePairIdentifier [32]byte)(bool, int, bool, float64, string, error){
offspringPolygenicDiseasesMap := coupleAnalysisObject.PolygenicDiseasesMap
offspringPolygenicDiseaseInfo, exists := offspringPolygenicDiseasesMap[diseaseName]
if (exists == false){
return false, 0, false, 0, "", nil
}
offspringPolygenicDiseaseMap := offspringPolygenicDiseaseInfo.PolygenicDiseaseInfoMap
genomePairPolygenicDiseaseInfo, exists := offspringPolygenicDiseaseMap[genomePairIdentifier]
if (exists == false){
return false, 0, false, 0, "", nil
}
genomePairLociInfoMap := genomePairPolygenicDiseaseInfo.LociInfoMap
locusInfoObject, exists := genomePairLociInfoMap[locusIdentifier]
if (exists == false){
return false, 0, false, 0, "", nil
}
offspringAverageRiskWeight := locusInfoObject.OffspringAverageRiskWeight
offspringOddsRatioIsKnown := locusInfoObject.OffspringOddsRatioIsKnown
if (offspringOddsRatioIsKnown == false){
return true, offspringAverageRiskWeight, false, 0, "", nil
}
offspringAverageOddsRatio := locusInfoObject.OffspringAverageOddsRatio
getOddsRatioFormatted := func()string{
offspringAverageUnknownOddsRatiosWeightSum := locusInfoObject.OffspringAverageUnknownOddsRatiosWeightSum
offspringAverageOddsRatioString := helpers.ConvertFloat64ToStringRounded(offspringAverageOddsRatio, 2)
if (offspringAverageUnknownOddsRatiosWeightSum == 0){
result := offspringAverageOddsRatioString + "x"
return result
}
if (offspringAverageUnknownOddsRatiosWeightSum < 0){
result := "<" + offspringAverageOddsRatioString + "x"
return result
}
// offspringAverageUnknownOddsRatiosWeightSum > 0
result := offspringAverageOddsRatioString + "x+"
return result
}
oddsRatioFormatted := getOddsRatioFormatted()
return true, offspringAverageRiskWeight, true, offspringAverageOddsRatio, oddsRatioFormatted, nil
return true, offspringAverageRiskScore, predictionConfidenceRangesMap, quantityOfLociKnown, quantityOfParentalPhasedLoci, sampleOffspringRiskScoresList, conflictExists, nil
}
//Outputs:
@@ -892,6 +773,79 @@ func GetOffspringDiscreteTraitRuleInfoFromGeneticAnalysis(coupleAnalysisObject g
}
//Outputs:
// -bool: Any analysis exists
// -float64: Predicted outcome (Example: Height in centimeters)
// -map[int]float64: Prediction confidence ranges map
// -Map Structure: Percentage probability of accurate prediction -> distance of range in both directions from prediction
// -int: Quantity of loci known
// -int: Quantity of phased loci
// -bool: Conflict exists (between any of these results for each genome)
// -error
func GetPersonNumericTraitInfoFromGeneticAnalysis(personAnalysisObject geneticAnalysis.PersonAnalysis, traitName string, genomeIdentifier [16]byte)(bool, float64, map[int]float64, int, int, bool, error){
personTraitsMap := personAnalysisObject.NumericTraitsMap
personTraitInfoObject, exists := personTraitsMap[traitName]
if (exists == false){
return false, 0, nil, 0, 0, false, nil
}
personTraitInfoMap := personTraitInfoObject.TraitInfoMap
conflictExists := personTraitInfoObject.ConflictExists
personGenomeTraitInfoObject, exists := personTraitInfoMap[genomeIdentifier]
if (exists == false){
return false, 0, nil, 0, 0, false, nil
}
predictedOutcome := personGenomeTraitInfoObject.PredictedOutcome
confidenceRangesMap := personGenomeTraitInfoObject.ConfidenceRangesMap
quantityOfLociKnown := personGenomeTraitInfoObject.QuantityOfLociKnown
quantityOfPhasedLoci := personGenomeTraitInfoObject.QuantityOfPhasedLoci
return true, predictedOutcome, confidenceRangesMap, quantityOfLociKnown, quantityOfPhasedLoci, conflictExists, nil
}
//Outputs:
// -bool: Analysis exists
// -float64: Average offspring outcome
// -map[int]float64: Prediction confidence ranges map
// -int: Quantity of loci known
// -int: Quantity of Parental phased loci
// -[]float64: 100 Sample offspring outcomes
// -bool: Conflict exists (Between this genome pair and other genome pairs)
// -error
func GetOffspringNumericTraitInfoFromGeneticAnalysis(coupleAnalysisObject geneticAnalysis.CoupleAnalysis, traitName string, genomePairIdentifier [32]byte)(bool, float64, map[int]float64, int, int, []float64, bool, error){
offspringTraitsMap := coupleAnalysisObject.NumericTraitsMap
traitInfoObject, exists := offspringTraitsMap[traitName]
if (exists == false){
return false, 0, nil, 0, 0, nil, false, nil
}
traitInfoMap := traitInfoObject.TraitInfoMap
conflictExists := traitInfoObject.ConflictExists
genomePairTraitInfoObject, exists := traitInfoMap[genomePairIdentifier]
if (exists == false){
return false, 0, nil, 0, 0, nil, false, nil
}
offspringAverageOutcome := genomePairTraitInfoObject.OffspringAverageOutcome
predictionConfidenceRangesMap := genomePairTraitInfoObject.PredictionConfidenceRangesMap
quantityOfLociKnown := genomePairTraitInfoObject.QuantityOfLociKnown
quantityOfParentalPhasedLoci := genomePairTraitInfoObject.QuantityOfParentalPhasedLoci
sampleOffspringOutcomesList := genomePairTraitInfoObject.SampleOffspringOutcomesList
return true, offspringAverageOutcome, predictionConfidenceRangesMap, quantityOfLociKnown, quantityOfParentalPhasedLoci, sampleOffspringOutcomesList, conflictExists, nil
}
// We use this function to verify a person genetic analysis is well formed
//TODO: Perform sanity checks on data
func VerifyPersonGeneticAnalysis(personAnalysisObject geneticAnalysis.PersonAnalysis)error{
@@ -943,25 +897,9 @@ func VerifyPersonGeneticAnalysis(personAnalysisObject geneticAnalysis.PersonAnal
for _, genomeIdentifier := range allGenomeIdentifiersList{
_, _, _, _, _, err := GetPersonPolygenicDiseaseInfoFromGeneticAnalysis(personAnalysisObject, diseaseName, genomeIdentifier)
_, _, _, _, _, _, err := GetPersonPolygenicDiseaseInfoFromGeneticAnalysis(personAnalysisObject, diseaseName, genomeIdentifier)
if (err != nil) { return err }
}
diseaseLocusObjectsList := diseaseObject.LociList
for _, diseaseLocusObject := range diseaseLocusObjectsList{
locusIdentifierHex := diseaseLocusObject.LocusIdentifier
locusIdentifier, err := encoding.DecodeHexStringTo3ByteArray(locusIdentifierHex)
if (err != nil) { return err }
for _, genomeIdentifier := range allGenomeIdentifiersList{
_, _, _, _, _, err := GetPersonPolygenicDiseaseLocusInfoFromGeneticAnalysis(personAnalysisObject, diseaseName, locusIdentifier, genomeIdentifier)
if (err != nil) { return err }
}
}
}
traitObjectsList, err := traits.GetTraitObjectsList()
@@ -995,6 +933,13 @@ func VerifyPersonGeneticAnalysis(personAnalysisObject geneticAnalysis.PersonAnal
if (err != nil) { return err }
}
}
} else {
for _, genomeIdentifier := range allGenomeIdentifiersList{
_, _, _, _, _, _, err := GetPersonNumericTraitInfoFromGeneticAnalysis(personAnalysisObject, traitName, genomeIdentifier)
if (err != nil) { return err }
}
}
}
@@ -1059,25 +1004,9 @@ func VerifyCoupleGeneticAnalysis(coupleAnalysisObject geneticAnalysis.CoupleAnal
for _, genomePairIdentifier := range allGenomePairIdentifiersList{
_, _, _, _, _, _, err := GetOffspringPolygenicDiseaseInfoFromGeneticAnalysis(coupleAnalysisObject, diseaseName, genomePairIdentifier)
_, _, _, _, _, _, _, err := GetOffspringPolygenicDiseaseInfoFromGeneticAnalysis(coupleAnalysisObject, diseaseName, genomePairIdentifier)
if (err != nil) { return err }
}
diseaseLocusObjectsList := diseaseObject.LociList
for _, diseaseLocusObject := range diseaseLocusObjectsList{
locusIdentifierHex := diseaseLocusObject.LocusIdentifier
locusIdentifier, err := encoding.DecodeHexStringTo3ByteArray(locusIdentifierHex)
if (err != nil) { return err }
for _, genomePairIdentifier := range allGenomePairIdentifiersList{
_, _, _, _, _, err := GetOffspringPolygenicDiseaseLocusInfoFromGeneticAnalysis(coupleAnalysisObject, diseaseName, locusIdentifier, genomePairIdentifier)
if (err != nil) { return err }
}
}
}
traitObjectsList, err := traits.GetTraitObjectsList()
@@ -1111,6 +1040,14 @@ func VerifyCoupleGeneticAnalysis(coupleAnalysisObject geneticAnalysis.CoupleAnal
if (err != nil) { return err }
}
}
} else {
for _, genomePairIdentifier := range allGenomePairIdentifiersList{
_, _, _, _, _, _, _, err := GetOffspringNumericTraitInfoFromGeneticAnalysis(coupleAnalysisObject, traitName, genomePairIdentifier)
if (err != nil) { return err }
}
}
}


@@ -57,7 +57,7 @@ type RawGenomeLocusValue struct{
// -bool: IsPhased (allele order corresponds to haplotype)
// -map[int64]RawGenomeLocusValue: RSID -> Locus allele value(s)
// -error (file not readable)
func ReadRawGenomeFile(fileReader io.Reader) (string, int, int64, int64, bool, map[int64]RawGenomeLocusValue, error) {
func ReadRawGenomeFile(fileReader io.Reader)(string, int, int64, int64, bool, map[int64]RawGenomeLocusValue, error) {
validBasesList := []string{"C", "A", "T", "G", "I", "D"}
@@ -543,4 +543,131 @@ func ReadRawGenomeFile(fileReader io.Reader) (string, int, int64, int64, bool, m
return "", 0, 0, 0, false, nil, errors.New("Cannot read genome file: File format not known.")
}
type LocusLocation struct{
Chromosome int
Position int
}
// This function reads locus locations from 23andMe genome files
// A locus location is the Chromosome and Position of the locus
//Outputs:
// -bool: Able to read file
// -map[int64]LocusLocation: Map of rsID -> Locus location object
// -error
func ReadRawGenomeFileLocusLocations(fileReader io.Reader)(bool, map[int64]LocusLocation, error){
fileBufioReader := bufio.NewReader(fileReader)
firstLine, err := fileBufioReader.ReadString('\n')
if (err != nil){
// File does not have another line
// Malformed 23andMe genome file: Too short.
return false, nil, nil
}
fileIs23andMe := strings.HasPrefix(firstLine, "# This data file generated by 23andMe at:")
if (fileIs23andMe == false){
// We can only read 23andMe files
return false, nil, nil
}
// Now we advance bufio reader to the snp rows
for {
fileLineString, err := fileBufioReader.ReadString('\n')
if (err != nil){
// File does not have another line
// Malformed 23andMe genome file: Too short.
return false, nil, nil
}
// All SNP rows come after this line:
// "# rsid chromosome position genotype"
lineReached := strings.HasPrefix(fileLineString, "# rsid")
if (lineReached == true){
break
}
}
// Map structure: Locus rsID -> LocusLocation
lociLocationsMap := make(map[int64]LocusLocation)
for {
fileLineString, err := fileBufioReader.ReadString('\n')
if (err != nil){
// File does not have another line
break
}
if (fileLineString == "\n"){
// This is the final line
break
}
fileLineWithoutNewline := strings.TrimSuffix(fileLineString, "\n")
// Rows look like this
// "rs4477212 1 82154 GG"
// "rs571313759 1 1181945 --" (-- means no entry)
// "i3001920 MT 16470 G" (one base is possible)
rowSlice := strings.Split(fileLineWithoutNewline, "\t")
if (len(rowSlice) != 4){
// Malformed 23andMe genome data: Invalid SNP row
return false, nil, nil
}
locusIdentifierString := rowSlice[0]
locusChromosomeString := rowSlice[1]
locusPositionString := rowSlice[2]
//Outputs:
// -bool: rsID found
// -int64: rsID value
getRSIDIdentifier := func()(bool, int64){
stringWithoutPrefix, prefixExists := strings.CutPrefix(locusIdentifierString, "rs")
if (prefixExists == false){
return false, 0
}
rsidInt64, err := helpers.ConvertStringToInt64(stringWithoutPrefix)
if (err != nil){
return false, 0
}
return true, rsidInt64
}
isRSID, locusRSID := getRSIDIdentifier()
if (isRSID == false){
// RSID is unknown.
// It is probably a custom identifier (Example: i713426)
continue
}
locusChromosome, err := helpers.ConvertStringToInt(locusChromosomeString)
if (err != nil){
// It is probably "MT" or "X" chromosome
continue
}
locusPosition, err := helpers.ConvertStringToInt(locusPositionString)
if (err != nil){
// 23andMe file is malformed: Contains invalid locusPosition.
return false, nil, nil
}
locusLocationObject := LocusLocation{
Chromosome: locusChromosome,
Position: locusPosition,
}
lociLocationsMap[locusRSID] = locusLocationObject
}
return true, lociLocationsMap, nil
}


@@ -3,6 +3,8 @@ package readRawGenomes_test
import "seekia/internal/genetics/readRawGenomes"
import "seekia/resources/geneticReferences/locusMetadata"
import "seekia/internal/genetics/createRawGenomes"
import "seekia/internal/helpers"
@@ -15,6 +17,11 @@ import "strings"
func TestAncestryDNAFileReading(t *testing.T){
err := locusMetadata.InitializeLocusMetadataVariables()
if (err != nil){
t.Fatalf("InitializeLocusMetadataVariables failed: " + err.Error())
}
fileString, expectedFileTimeUnix, numberOfAddedLoci, fileRSIDsMap, err := createRawGenomes.CreateFakeRawGenome_AncestryDNA()
if (err != nil){
t.Fatalf("Failed to create fake AncestryDNA genome: " + err.Error())
@@ -65,6 +72,11 @@ func TestAncestryDNAFileReading(t *testing.T){
func Test23andMeFileReading(t *testing.T){
err := locusMetadata.InitializeLocusMetadataVariables()
if (err != nil){
t.Fatalf("InitializeLocusMetadataVariables failed: " + err.Error())
}
newRawGenome, fileCreationTime, fileNumberOfLoci, fileRSIDsMap, err := createRawGenomes.CreateFakeRawGenome_23andMe()
if (err != nil){
t.Fatalf("Failed to create fake 23andMe Genome: " + err.Error())


@@ -15,8 +15,16 @@ import "testing"
func TestPersonSampleAnalyses(t *testing.T){
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
traits.InitializeTraitVariables()
err := polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
person1AnalysisObject, err := sampleAnalyses.GetSamplePerson1Analysis()
if (err != nil) {
@@ -43,8 +51,16 @@ func TestPersonSampleAnalyses(t *testing.T){
func TestCoupleSampleAnalyses(t *testing.T){
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
traits.InitializeTraitVariables()
err := polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
coupleAnalysisObject, err := sampleAnalyses.GetSampleCoupleAnalysis()
if (err != nil){


@@ -49,6 +49,29 @@ func ConvertCentimetersToFeetInchesTranslatedString(centimeters float64)(string,
return "", errors.New("ConvertCentimetersToFeetInchesTranslatedString called with invalid centimeters.")
}
getInchUnits := func()string{
if (inputInches == 1){
result := translation.TranslateTextFromEnglishToMyLanguage("inch")
return result
}
inchesTranslated := translation.TranslateTextFromEnglishToMyLanguage("inches")
return inchesTranslated
}
inchUnits := getInchUnits()
inputInchesString := ConvertFloat64ToStringRounded(inputInches, 1)
if (inputFeet == 0){
formattedResult := inputInchesString + " " + inchUnits
return formattedResult, nil
}
getFeetUnits := func()string{
if (inputFeet <= 1){
@@ -64,22 +87,7 @@ func ConvertCentimetersToFeetInchesTranslatedString(centimeters float64)(string,
feetUnits := getFeetUnits()
getInchUnits := func()string{
if (inputInches == 1){
result := translation.TranslateTextFromEnglishToMyLanguage("inch")
return result
}
inchesTranslated := translation.TranslateTextFromEnglishToMyLanguage("inches")
return inchesTranslated
}
inchUnits := getInchUnits()
inputFeetString := ConvertIntToString(inputFeet)
inputInchesString := ConvertFloat64ToStringRounded(inputInches, 1)
formattedResult := inputFeetString + " " + feetUnits + ", " + inputInchesString + " " + inchUnits
@@ -1098,34 +1106,11 @@ func DeleteIndexFromStringList(inputList []string, indexToDelete int)([]string,
//Outputs:
// -[]string: New list
// -bool: Deleted any items
func DeleteAllMatchingItemsFromStringList(inputList []string, itemToDelete string)([]string, bool){
func DeleteAllMatchingItemsFromList[E comparable](inputList []E, itemToDelete E)([]E, bool){
listCopy := slices.Clone(inputList)
deletionFunction := func(input string)bool{
if (input == itemToDelete){
return true
}
return false
}
newList := slices.DeleteFunc(listCopy, deletionFunction)
if (len(newList) == len(inputList)){
return newList, false
}
return newList, true
}
//Outputs:
// -[]string: New list
// -bool: Deleted any items
func DeleteAllMatchingItemsFromProfileHashList(inputList [][28]byte, itemToDelete [28]byte)([][28]byte, bool){
listCopy := slices.Clone(inputList)
deletionFunction := func(input [28]byte)bool{
deletionFunction := func(input E)bool{
if (input == itemToDelete){
return true
}
@@ -1404,23 +1389,23 @@ func SortIdentityHashListToUnicodeOrder(inputList [][16]byte)error{
identityHashStringsMap[identityHash] = identityHashString
}
compareFunction := func(identityHashA [16]byte, identityHashB [16]byte)int{
compareFunction := func(identityHash1 [16]byte, identityHash2 [16]byte)int{
if (identityHashA == identityHashB){
if (identityHash1 == identityHash2){
return 0
}
identityHashAString, exists := identityHashStringsMap[identityHashA]
identityHash1String, exists := identityHashStringsMap[identityHash1]
if (exists == false){
panic("identityHashA is missing from identityHashStringsMap.")
panic("identityHash1 is missing from identityHashStringsMap.")
}
identityHashBString, exists := identityHashStringsMap[identityHashB]
identityHash2String, exists := identityHashStringsMap[identityHash2]
if (exists == false){
panic("identityHashB is missing from identityHashStringsMap.")
panic("identityHash2 is missing from identityHashStringsMap.")
}
if (identityHashAString < identityHashBString){
if (identityHash1String < identityHash2String){
return -1
}
@ -1948,29 +1933,29 @@ func ConvertFloat64ToRoundedStringWithTranslatedUnits(inputFloat float64)(string
}
// This function takes a number and the min and max range of that number
// It returns a number scaled between a new min and max
func ScaleNumberProportionally(ascending bool, input int, inputMin int, inputMax int, newMin int, newMax int)(int, error){
// This function takes an int and the min and max range of that int
// It returns an int scaled between a new min and max
func ScaleIntProportionally(ascending bool, input int, inputMin int, inputMax int, newMin int, newMax int)(int, error){
if (inputMin == inputMax) {
return inputMin, nil
}
if (inputMin > inputMax) {
return 0, errors.New("ScaleNumberProportionally error: InputMin is greater than inputMax")
return 0, errors.New("ScaleIntProportionally error: InputMin is greater than inputMax")
}
if (input < inputMin) {
return 0, errors.New("ScaleNumberProportionally error: Input is less than inputMin")
return 0, errors.New("ScaleIntProportionally error: Input is less than inputMin")
}
if (input > inputMax) {
return 0, errors.New("ScaleNumberProportionally error: Input is greater than inputMax")
return 0, errors.New("ScaleIntProportionally error: Input is greater than inputMax")
}
if (newMin == newMax) {
return newMin, nil
}
if (newMin > newMax){
return 0, errors.New("ScaleNumberProportionally error: newMin is greater than newMax.")
return 0, errors.New("ScaleIntProportionally error: newMin is greater than newMax.")
}
inputRangePortionLength := input - inputMin
@ -2002,6 +1987,58 @@ func ScaleNumberProportionally(ascending bool, input int, inputMin int, inputMax
return result, nil
}
// This function takes a float64 and the min and max range of that float64
// It returns a float64 scaled between a new min and max
func ScaleFloat64Proportionally(ascending bool, input float64, inputMin float64, inputMax float64, newMin float64, newMax float64)(float64, error){
if (inputMin == inputMax) {
return inputMin, nil
}
if (inputMin > inputMax) {
return 0, errors.New("ScaleFloat64Proportionally error: InputMin is greater than inputMax")
}
if (input < inputMin) {
return 0, errors.New("ScaleFloat64Proportionally error: Input is less than inputMin")
}
if (input > inputMax) {
return 0, errors.New("ScaleFloat64Proportionally error: Input is greater than inputMax")
}
if (newMin == newMax) {
return newMin, nil
}
if (newMin > newMax){
return 0, errors.New("ScaleFloat64Proportionally error: newMin is greater than newMax.")
}
inputRangePortionLength := input - inputMin
inputRangeDistance := inputMax - inputMin
inputRangePortion := inputRangePortionLength/inputRangeDistance
// This represents the portion of our output range that we want to travel across
getOutputRangePortion := func()float64{
if (ascending == true){
return inputRangePortion
}
outputRangePortion := 1 - inputRangePortion
return outputRangePortion
}
outputRangePortion := getOutputRangePortion()
outputRangeDistance := newMax - newMin
outputRangePortionLength := outputRangeDistance * outputRangePortion
result := newMin + outputRangePortionLength
return result, nil
}
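Both scalers map the input's position within [inputMin, inputMax] onto [newMin, newMax], mirroring the output range when ascending is false. A minimal sketch of the core arithmetic (validation omitted; the function name is illustrative):

```go
package main

import "fmt"

// scaleFloat64 maps input from [inputMin, inputMax] onto
// [newMin, newMax]; descending mode (ascending == false)
// mirrors the output range.
func scaleFloat64(ascending bool, input float64, inputMin float64, inputMax float64, newMin float64, newMax float64) float64 {
	// Portion of the input range that the input sits across
	portion := (input - inputMin) / (inputMax - inputMin)
	if (ascending == false) {
		portion = 1 - portion
	}
	return newMin + (newMax-newMin)*portion
}

func main() {
	fmt.Println(scaleFloat64(true, 25, 0, 100, 0, 200))  // 50
	fmt.Println(scaleFloat64(false, 25, 0, 100, 0, 200)) // 150
}
```

The two printed values match tests 2 and 3 in the repo's TestNumberProportionalScaling below.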
func XORTwo32ByteArrays(array1 [32]byte, array2 [32]byte)[32]byte{
var newArray [32]byte
@ -2040,3 +2077,71 @@ func Split32ByteArrayInHalf(inputArray [32]byte)([16]byte, [16]byte){
return piece1, piece2
}
// This function takes a list of ints and a target value, and returns the int in the list that is the closest to that value
// If there is a tie, the function returns the earliest of the tied elements in the list
func GetClosestIntInList(inputList []int, targetValue int)(int, error){
if (len(inputList) == 0){
return 0, errors.New("GetClosestIntInList called with empty inputList.")
}
closestValue := 0
closestValueDistance := float64(0)
for index, element := range inputList{
if (element == targetValue){
return element, nil
}
distance := math.Abs(float64(element - targetValue))
if (index == 0 || distance < closestValueDistance){
closestValue = element
closestValueDistance = distance
}
}
return closestValue, nil
}
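A quick standalone check of the tie-breaking behavior described in the comment: the strict `<` comparison means an equally-distant later element never replaces an earlier one (names here are illustrative):

```go
package main

import (
	"fmt"
	"math"
)

// closestInt returns the list element nearest to target.
// Strict '<' keeps the earliest element on a distance tie.
func closestInt(list []int, target int) int {
	closest := list[0]
	closestDistance := math.Abs(float64(list[0] - target))
	for _, element := range list[1:] {
		distance := math.Abs(float64(element - target))
		if distance < closestDistance {
			closest = element
			closestDistance = distance
		}
	}
	return closest
}

func main() {
	fmt.Println(closestInt([]int{10, 20, 30}, 24)) // 20
	fmt.Println(closestInt([]int{15, 25}, 20))     // 15: tie goes to the earlier item
}
```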
func CountMatchingElementsInSlice[E comparable](inputSlice []E, inputElement E)int{
counter := 0
for _, element := range inputSlice{
if (element == inputElement){
counter += 1
}
}
return counter
}
func CheckIfAllItemsInSliceAreIdentical[E comparable](inputSlice []E)bool{
if (len(inputSlice) <= 1){
return true
}
initialElement := inputSlice[0]
for index, element := range inputSlice{
if (index == 0){
continue
}
if (element != initialElement){
return false
}
}
return true
}
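The two new generics above are small; a self-contained sketch of their behavior, including the vacuous-truth case for slices of length 0 or 1 (function names are illustrative):

```go
package main

import "fmt"

// countMatching counts elements equal to inputElement.
func countMatching[E comparable](inputSlice []E, inputElement E) int {
	counter := 0
	for _, element := range inputSlice {
		if element == inputElement {
			counter += 1
		}
	}
	return counter
}

// allIdentical reports whether every element equals the first;
// empty and single-element slices are vacuously identical.
func allIdentical[E comparable](inputSlice []E) bool {
	for i := 1; i < len(inputSlice); i++ {
		if inputSlice[i] != inputSlice[0] {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(countMatching([]int{1, 2, 1, 1}, 1)) // 3
	fmt.Println(allIdentical([]string{"x", "x"}))    // true
	fmt.Println(allIdentical([]string{}))            // true: vacuously identical
}
```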


@ -321,86 +321,86 @@ func TestSliceFunctions(t *testing.T){
func TestNumberProportionalScaling(t *testing.T){
result, err := helpers.ScaleNumberProportionally(true, 50, 0, 100, 0, 50)
result, err := helpers.ScaleIntProportionally(true, 50, 0, 100, 0, 50)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 25){
t.Fatalf("ScaleNumberProportionally failed test 1.")
t.Fatalf("ScaleIntProportionally failed test 1.")
}
result, err = helpers.ScaleNumberProportionally(true, 25, 0, 100, 0, 200)
result, err = helpers.ScaleIntProportionally(true, 25, 0, 100, 0, 200)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 50){
t.Fatalf("ScaleNumberProportionally failed test 2.")
t.Fatalf("ScaleIntProportionally failed test 2.")
}
result, err = helpers.ScaleNumberProportionally(false, 25, 0, 100, 0, 200)
result, err = helpers.ScaleIntProportionally(false, 25, 0, 100, 0, 200)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 150){
t.Fatalf("ScaleNumberProportionally failed test 3.")
t.Fatalf("ScaleIntProportionally failed test 3.")
}
result, err = helpers.ScaleNumberProportionally(true, 1, 0, 10, 0, 200)
result, err = helpers.ScaleIntProportionally(true, 1, 0, 10, 0, 200)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 20){
t.Fatalf("ScaleNumberProportionally failed test 4.")
t.Fatalf("ScaleIntProportionally failed test 4.")
}
result, err = helpers.ScaleNumberProportionally(true, -50, -100, 0, 0, 200)
result, err = helpers.ScaleIntProportionally(true, -50, -100, 0, 0, 200)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 100){
t.Fatalf("ScaleNumberProportionally failed test 5.")
t.Fatalf("ScaleIntProportionally failed test 5.")
}
result, err = helpers.ScaleNumberProportionally(true, -25, -100, 0, 0, 200)
result, err = helpers.ScaleIntProportionally(true, -25, -100, 0, 0, 200)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 150){
t.Fatalf("ScaleNumberProportionally failed test 6.")
t.Fatalf("ScaleIntProportionally failed test 6.")
}
result, err = helpers.ScaleNumberProportionally(true, 50, 0, 100, 0, 2)
result, err = helpers.ScaleIntProportionally(true, 50, 0, 100, 0, 2)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 1){
t.Fatalf("ScaleNumberProportionally failed test 7.")
t.Fatalf("ScaleIntProportionally failed test 7.")
}
result, err = helpers.ScaleNumberProportionally(true, 10, 0, 100, 5, 25)
result, err = helpers.ScaleIntProportionally(true, 10, 0, 100, 5, 25)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 7){
t.Fatalf("ScaleNumberProportionally failed test 8.")
t.Fatalf("ScaleIntProportionally failed test 8.")
}
result, err = helpers.ScaleNumberProportionally(true, 100, 0, 100, 2, 22)
result, err = helpers.ScaleIntProportionally(true, 100, 0, 100, 2, 22)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 22){
t.Fatalf("ScaleNumberProportionally failed test 9.")
t.Fatalf("ScaleIntProportionally failed test 9.")
}
result, err = helpers.ScaleNumberProportionally(false, 0, 0, 100, 2, 22)
result, err = helpers.ScaleIntProportionally(false, 0, 0, 100, 2, 22)
if (err != nil){
t.Fatalf("ScaleNumberProportionally failed: " + err.Error())
t.Fatalf("ScaleIntProportionally failed: " + err.Error())
}
if (result != 22){
t.Fatalf("ScaleNumberProportionally failed test 10.")
t.Fatalf("ScaleIntProportionally failed test 10.")
}
}


@ -121,7 +121,7 @@ func GetImageWithEmojiOverlay(inputImage image.Image, emojiImage image.Image, em
imageLongerSideLength := max(imageWidth, imageHeight)
// This is the length of the longest side of the emoji we will draw
emojiLongerSideMaximumLength, err := helpers.ScaleNumberProportionally(true, emojiScale, 0, 100, 1, imageLongerSideLength)
emojiLongerSideMaximumLength, err := helpers.ScaleIntProportionally(true, emojiScale, 0, 100, 1, imageLongerSideLength)
if (err != nil) { return nil, err }
newEmoji, err := imagery.ResizeGolangImage(emojiImage, emojiLongerSideMaximumLength)
@ -130,10 +130,10 @@ func GetImageWithEmojiOverlay(inputImage image.Image, emojiImage image.Image, em
// Now we get the X and Y Coordinate point for where we will draw the emoji
// We first find the center coordinate of the emoji we are drawing
emojiCenterXCoordinate, err := helpers.ScaleNumberProportionally(true, xAxisPercentage, 0, 100, 0, imageWidth)
emojiCenterXCoordinate, err := helpers.ScaleIntProportionally(true, xAxisPercentage, 0, 100, 0, imageWidth)
if (err != nil) { return nil, err }
emojiCenterYCoordinate, err := helpers.ScaleNumberProportionally(false, yAxisPercentage, 0, 100, 0, imageHeight)
emojiCenterYCoordinate, err := helpers.ScaleIntProportionally(false, yAxisPercentage, 0, 100, 0, imageHeight)
if (err != nil) { return nil, err }
emojiWidth, emojiHeight, err := imagery.GetImageWidthAndHeightPixels(newEmoji)


@ -497,7 +497,7 @@ func PixelateGolangImage(inputImage image.Image, amount0to100 int)(image.Image,
longerSideLength := max(widthPixels, heightPixels)
pixelationAmountInt, err := helpers.ScaleNumberProportionally(true, amount0to100, 0, 100, 0, longerSideLength/5)
pixelationAmountInt, err := helpers.ScaleIntProportionally(true, amount0to100, 0, 100, 0, longerSideLength/5)
if (err != nil) { return nil, err }
rectangle := image.Rect(0, 0, widthPixels, heightPixels)


@ -140,12 +140,12 @@ func GetAppDatabaseFolderPath()(string, error){
// Function will create new file or overwrite existing:
func CreateOrOverwriteFile(content []byte, folderPath string, filename string) error{
_, err := CreateFolder(folderPath)
if (err != nil) { return err }
filepath := goFilepath.Join(folderPath, filename)
newFile, err := os.Create(filepath)
if (err != nil) { return err }


@ -469,7 +469,7 @@ func StartUpdatingMyConversations(identityType string, networkType byte) error{
// Input is a value between 0-100
// We reduce it down to a value between 0-50
newProgressInt, err := helpers.ScaleNumberProportionally(true, input, 0, 100, 0, 50)
newProgressInt, err := helpers.ScaleIntProportionally(true, input, 0, 100, 0, 50)
if (err != nil) { return err }
newPercentageProgressFloat := float64(newProgressInt)/100
@ -715,7 +715,7 @@ func StartUpdatingMyConversations(identityType string, networkType byte) error{
return nil
}
newScaledPercentageInt, err := helpers.ScaleNumberProportionally(true, index, 0, maximumIndex, 70, 85)
newScaledPercentageInt, err := helpers.ScaleIntProportionally(true, index, 0, maximumIndex, 70, 85)
if (err != nil) { return err }
newProgressFloat := float64(newScaledPercentageInt)/100


@ -268,7 +268,7 @@ func GetUpdatedMyChatMessagesMapList(myIdentityType string, networkType byte, up
for messageHash, messageInbox := range myRawInboxMessageHashesMap{
newPercentageProgress, err := helpers.ScaleNumberProportionally(true, index, 0, maximumIndex, 20, 100)
newPercentageProgress, err := helpers.ScaleIntProportionally(true, index, 0, maximumIndex, 20, 100)
if (err != nil){ return err }
err = updateProgressFunction(newPercentageProgress)


@ -776,7 +776,7 @@ func GetProfileVerdictMaps(profileHash [28]byte, profileNetworkType byte, integr
}
// We can omit the current profileHash, because we are already checking it
attributeProfileHashesList, _ := helpers.DeleteAllMatchingItemsFromProfileHashList(attributeProfilesList, profileHash)
attributeProfileHashesList, _ := helpers.DeleteAllMatchingItemsFromList(attributeProfilesList, profileHash)
return attributeProfileHashesList, nil
}


@ -480,7 +480,7 @@ func StartUpdatingViewedContent(networkType byte)error{
return nil
}
newScaledPercentageInt, err := helpers.ScaleNumberProportionally(true, index, 0, maximumIndex, 50, 80)
newScaledPercentageInt, err := helpers.ScaleIntProportionally(true, index, 0, maximumIndex, 50, 80)
if (err != nil) { return err }
newProgressFloat := float64(newScaledPercentageInt)/100


@ -365,7 +365,7 @@ func StartUpdatingViewedModerators(networkType byte)error{
return nil
}
progressPercentage, err := helpers.ScaleNumberProportionally(true, index, 0, numberOfModerators-1, 20, 50)
progressPercentage, err := helpers.ScaleIntProportionally(true, index, 0, numberOfModerators-1, 20, 50)
if (err != nil) { return err }
progressFloat := float64(progressPercentage)/100
@ -452,7 +452,7 @@ func StartUpdatingViewedModerators(networkType byte)error{
return nil
}
newScaledPercentageInt, err := helpers.ScaleNumberProportionally(true, index, 0, maximumIndex, 50, 80)
newScaledPercentageInt, err := helpers.ScaleIntProportionally(true, index, 0, maximumIndex, 50, 80)
if (err != nil) { return err }
newProgressFloat := float64(newScaledPercentageInt)/100


@ -467,7 +467,7 @@ func DeleteContactCategory(identityType string, categoryToDeleteName string)erro
contactCategoriesListBase64 := strings.Split(contactCategoriesListString, "+")
newCategoriesList, deletedAny := helpers.DeleteAllMatchingItemsFromStringList(contactCategoriesListBase64, categoryToDeleteNameBase64)
newCategoriesList, deletedAny := helpers.DeleteAllMatchingItemsFromList(contactCategoriesListBase64, categoryToDeleteNameBase64)
if (deletedAny == false){
newContactsMapList = append(newContactsMapList, contactMap)
continue


@ -163,7 +163,7 @@ func (listObject *MyList) DeleteListItem(item string)error{
currentList := listObject.memoryList
newList, anyDeleted := helpers.DeleteAllMatchingItemsFromStringList(currentList, item)
newList, anyDeleted := helpers.DeleteAllMatchingItemsFromList(currentList, item)
if (anyDeleted == false){
listObject.memoryMutex.Unlock()
return nil


@ -323,7 +323,7 @@ func StartUpdatingMyMatches(networkType byte)error{
return nil
}
progressPercentage, err := helpers.ScaleNumberProportionally(true, index, 0, maximumIndex, 0, 50)
progressPercentage, err := helpers.ScaleIntProportionally(true, index, 0, maximumIndex, 0, 50)
if (err != nil) { return err }
progressFloat := float64(progressPercentage)/100
@ -449,7 +449,7 @@ func StartUpdatingMyMatches(networkType byte)error{
return nil
}
newScaledPercentageInt, err := helpers.ScaleNumberProportionally(true, index, 0, maximumIndex, 50, 80)
newScaledPercentageInt, err := helpers.ScaleIntProportionally(true, index, 0, maximumIndex, 50, 95)
if (err != nil) { return err }
newProgressFloat := float64(newScaledPercentageInt)/100


@ -1382,11 +1382,19 @@ func TestCreateAndReadRequest_BroadcastContent(t *testing.T){
// We initialize these variables so we can create fake profiles
traits.InitializeTraitVariables()
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
err := profileFormat.InitializeProfileFormatVariables()
err := polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = profileFormat.InitializeProfileFormatVariables()
if (err != nil) {
t.Fatalf("Failed to initialize profile format variables: " + err.Error())
}


@ -326,15 +326,23 @@ func TestCreateAndReadResponse_GetProfilesInfo(t *testing.T){
func TestCreateAndReadResponse_GetProfiles(t *testing.T){
err := profileFormat.InitializeProfileFormatVariables()
monogenicDiseases.InitializeMonogenicDiseaseVariables()
err := polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = profileFormat.InitializeProfileFormatVariables()
if (err != nil) {
t.Fatalf("Failed to initialize profile format variables: " + err.Error())
}
traits.InitializeTraitVariables()
monogenicDiseases.InitializeMonogenicDiseaseVariables()
polygenicDiseases.InitializePolygenicDiseaseVariables()
hostPublicIdentityKey, hostPrivateIdentityKey, err := identity.GetNewRandomPublicPrivateIdentityKeys()
if (err != nil) {
t.Fatalf("Failed to create random identity keys: " + err.Error())


@ -335,7 +335,7 @@ func StartUpdatingViewedHosts(networkType byte)error{
return nil
}
progressPercentage, err := helpers.ScaleNumberProportionally(true, index, 0, numberOfEnabledHosts-1, 0, 50)
progressPercentage, err := helpers.ScaleIntProportionally(true, index, 0, numberOfEnabledHosts-1, 0, 50)
if (err != nil) { return err }
progressFloat := float64(progressPercentage)/100
@ -413,13 +413,13 @@ func StartUpdatingViewedHosts(networkType byte)error{
hostAttributeValuesMap[hostIdentityHashString] = attributeValueFloat
}
isStopped := CheckIfBuildViewedHostsIsStopped()
if (isStopped == true){
return nil
}
newScaledPercentageInt, err := helpers.ScaleNumberProportionally(true, index, 0, maximumIndex, 50, 80)
newScaledPercentageInt, err := helpers.ScaleIntProportionally(true, index, 0, maximumIndex, 50, 80)
if (err != nil) { return err }
newProgressFloat := float64(newScaledPercentageInt)/100
@ -429,47 +429,47 @@ func StartUpdatingViewedHosts(networkType byte)error{
appMemory.SetMemoryEntry("ViewedHostsReadyProgressStatus", newProgressString)
}
compareHostsFunction := func(identityHashA string, identityHashB string)int{
compareHostsFunction := func(identityHash1 string, identityHash2 string)int{
if (identityHashA == identityHashB){
if (identityHash1 == identityHash2){
panic("compareHostsFunction called with duplicate hosts.")
}
attributeValueA, attributeValueAExists := hostAttributeValuesMap[identityHashA]
attributeValue1, attributeValue1Exists := hostAttributeValuesMap[identityHash1]
attributeValueB, attributeValueBExists := hostAttributeValuesMap[identityHashB]
attributeValue2, attributeValue2Exists := hostAttributeValuesMap[identityHash2]
if (attributeValueAExists == false && attributeValueBExists == false){
if (attributeValue1Exists == false && attributeValue2Exists == false){
// We don't know the attribute value for either host
// We sort hosts in unicode order
if (identityHashA < identityHashB){
if (identityHash1 < identityHash2){
return -1
}
return 1
} else if (attributeValueAExists == true && attributeValueBExists == false){
} else if (attributeValue1Exists == true && attributeValue2Exists == false){
// We sort unknown attribute hosts to the back of the list
return -1
} else if (attributeValueAExists == false && attributeValueBExists == true){
} else if (attributeValue1Exists == false && attributeValue2Exists == true){
return 1
}
// Both attribute values exist
if (attributeValueA == attributeValueB){
if (attributeValue1 == attributeValue2){
// We sort identity hashes in unicode order
if (identityHashA < identityHashB){
if (identityHash1 < identityHash2){
return -1
}
return 1
}
if (attributeValueA < attributeValueB){
if (attributeValue1 < attributeValue2){
if (currentSortDirection == "Ascending"){
return -1


@ -5,7 +5,7 @@
package attributeDisplay
//TODO: Deal with singular/multiple values and how that changes an attribute value's units
// For example: 1 variant, 2 variants
// For example: 1 variant, 2 variants, 1 centimeter, 2 centimeters
import "seekia/resources/worldLocations"
import "seekia/resources/worldLanguages"
@ -43,6 +43,7 @@ func GetProfileAttributeDisplayInfo(attributeName string)(string, bool, func(str
}
formatPercentageFunction := func(input string)(string, error){
valueFloat64, err := helpers.ConvertStringToFloat64(input)
if (err != nil){
return "", errors.New("formatPercentageFunction called with non-numeric " + attributeName + " value: " + input)
@ -52,7 +53,8 @@ func GetProfileAttributeDisplayInfo(attributeName string)(string, bool, func(str
return "", errors.New("formatPercentageFunction called with invalid " + attributeName + " percentage value: " + input)
}
result := input + "%"
result := helpers.ConvertIntToString(int(valueFloat64))
return result, nil
}
@ -197,24 +199,6 @@ func GetProfileAttributeDisplayInfo(attributeName string)(string, bool, func(str
return titleTranslated, false, passValueFunction, "", noResponseTranslated, nil
}
case "CatsRating":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Cats Rating")
return titleTranslated, true, passValueFunction, "/10", noResponseTranslated, nil
}
case "DogsRating":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Dogs Rating")
return titleTranslated, true, passValueFunction, "/10", noResponseTranslated, nil
}
case "PetsRating":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Pets Rating")
return titleTranslated, true, passValueFunction, "/10", noResponseTranslated, nil
}
case "GenderIdentity":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Gender Identity")
@ -231,20 +215,23 @@ func GetProfileAttributeDisplayInfo(attributeName string)(string, bool, func(str
return titleTranslated, false, translateGenderFunction, "", noResponseTranslated, nil
}
case "FruitRating",
"VegetablesRating",
"NutsRating",
"GrainsRating",
"DairyRating",
"SeafoodRating",
"BeefRating",
"PorkRating",
"PoultryRating",
"EggsRating",
"BeansRating":{
"VegetablesRating",
"NutsRating",
"GrainsRating",
"DairyRating",
"SeafoodRating",
"BeefRating",
"PorkRating",
"PoultryRating",
"EggsRating",
"BeansRating",
"PetsRating",
"DogsRating",
"CatsRating":{
foodName := strings.TrimSuffix(attributeName, "Rating")
thingName := strings.TrimSuffix(attributeName, "Rating")
attributeTitle := foodName + " Rating"
attributeTitle := thingName + " Rating"
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage(attributeTitle)
@ -422,14 +409,92 @@ func GetProfileAttributeDisplayInfo(attributeName string)(string, bool, func(str
return titleTranslated, true, passValueFunction, "/4", noResponseTranslated, nil
}
case "Height":{
case "Height",
"PredictedHeight",
"OffspringPredictedHeight":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Height")
getAttributeTitle := func()string{
switch attributeName{
case "Height":{
return "Height"
}
case "PredictedHeight":{
return "Predicted Height"
}
}
return "Offspring Predicted Height"
}
attributeTitle := getAttributeTitle()
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage(attributeTitle)
getMyMetricOrImperial := func()(string, error){
exists, metricOrImperial, err := globalSettings.GetSetting("MetricOrImperial")
if (err != nil) { return "", err }
if (exists == false){
return "Metric", nil
}
if (metricOrImperial != "Metric" && metricOrImperial != "Imperial"){
return "", errors.New("Malformed globalSettings: Invalid metricOrImperial: " + metricOrImperial)
}
return metricOrImperial, nil
}
myMetricOrImperial, err := getMyMetricOrImperial()
if (err != nil) { return "", false, nil, "", "", err }
formatHeightFunction := func(input string)(string, error){
inputCentimeters, err := helpers.ConvertStringToFloat64(input)
if (err != nil) { return "", err }
if (myMetricOrImperial == "Metric"){
centimetersString := helpers.ConvertFloat64ToStringRounded(inputCentimeters, 2)
return centimetersString, nil
}
feetInchesString, err := helpers.ConvertCentimetersToFeetInchesTranslatedString(inputCentimeters)
if (err != nil) { return "", err }
return feetInchesString, nil
}
getUnitsTranslated := func()string{
if (myMetricOrImperial == "Metric"){
unitsTranslated := translation.TranslateTextFromEnglishToMyLanguage("Centimeters")
return unitsTranslated
}
// There are no units to add
// The value is in the format "5 feet, 10 inches"
return ""
}
unitsTranslated := getUnitsTranslated()
unitsTranslated := translation.TranslateTextFromEnglishToMyLanguage("centimeters")
unitsWithPadding := " " + unitsTranslated
return titleTranslated, true, roundNumberFunction, unitsWithPadding, noResponseTranslated, nil
getUnavailableText := func()string{
if (attributeName == "Height"){
return noResponseTranslated
}
return unknownTranslated
}
unavailableText := getUnavailableText()
return titleTranslated, true, formatHeightFunction, unitsWithPadding, unavailableText, nil
}
case "Sex":{
@ -452,24 +517,6 @@ func GetProfileAttributeDisplayInfo(attributeName string)(string, bool, func(str
return titleTranslated, false, translateValueFunction, "", unknownTranslated, nil
}
case "OffspringProbabilityOfAnyMonogenicDisease":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Probability Of Any Monogenic Disease")
return titleTranslated, true, formatPercentageFunction, "", unknownTranslated, nil
}
case "TotalPolygenicDiseaseRiskScore":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Total Polygenic Disease Risk Score")
return titleTranslated, true, passValueFunction, "/100", noResponseTranslated, nil
}
case "OffspringTotalPolygenicDiseaseRiskScore":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Total Polygenic Disease Risk Score")
return titleTranslated, true, passValueFunction, "/100", unknownTranslated, nil
}
case "23andMe_AncestryComposition":{
// There is no way to display this as text, we use the gui instead
@ -867,6 +914,24 @@ func GetProfileAttributeDisplayInfo(attributeName string)(string, bool, func(str
return titleTranslated, true, passValueFunction, "%", unknownTranslated, nil
}
case "OffspringProbabilityOfAnyMonogenicDisease":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Probability Of Any Monogenic Disease")
return titleTranslated, true, formatPercentageFunction, "%", unknownTranslated, nil
}
case "TotalPolygenicDiseaseRiskScore":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Total Polygenic Disease Risk Score")
return titleTranslated, true, passValueFunction, "/100", noResponseTranslated, nil
}
case "OffspringTotalPolygenicDiseaseRiskScore":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Total Polygenic Disease Risk Score")
return titleTranslated, true, passValueFunction, "/100", unknownTranslated, nil
}
case "OffspringProbabilityOfAnyMonogenicDisease_NumberOfDiseasesTested":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Probability Of Any Monogenic Disease - Number Of Diseases Tested")
@ -897,6 +962,107 @@ func GetProfileAttributeDisplayInfo(attributeName string)(string, bool, func(str
return titleTranslated, true, passValueFunction, unitsWithPadding, "", nil
}
case "AutismRiskScore",
"OffspringAutismRiskScore",
"ObesityRiskScore",
"OffspringObesityRiskScore",
"HomosexualnessScore",
"OffspringHomosexualnessScore":{
getAttributeTitle := func()(string, error){
switch attributeName{
case "AutismRiskScore":{
return "Autism Risk Score", nil
}
case "OffspringAutismRiskScore":{
return "Offspring Autism Risk Score", nil
}
case "ObesityRiskScore":{
return "Obesity Risk Score", nil
}
case "OffspringObesityRiskScore":{
return "Offspring Obesity Risk Score", nil
}
case "HomosexualnessScore":{
return "Homosexualness Score", nil
}
case "OffspringHomosexualnessScore":{
return "Offspring Homosexualness Score", nil
}
}
return "", errors.New("getAttributeTitle reached with unknown attributeName: " + attributeName)
}
attributeTitle, err := getAttributeTitle()
if (err != nil) { return "", false, nil, "", "", err }
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage(attributeTitle)
return titleTranslated, true, passValueFunction, "/10", unknownTranslated, nil
}
case "PredictedEyeColor":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Predicted Eye Color")
return titleTranslated, false, translateValueFunction, "", unknownTranslated, nil
}
case "PredictedHairTexture":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Predicted Hair Texture")
return titleTranslated, false, translateValueFunction, "", unknownTranslated, nil
}
case "PredictedLactoseTolerance":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Predicted Lactose Tolerance")
return titleTranslated, false, translateValueFunction, "", unknownTranslated, nil
}
case "OffspringBlueEyesProbability":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Blue Eyes Probability")
return titleTranslated, false, formatPercentageFunction, "%", unknownTranslated, nil
}
case "OffspringGreenEyesProbability":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Green Eyes Probability")
return titleTranslated, false, formatPercentageFunction, "%", unknownTranslated, nil
}
case "OffspringHazelEyesProbability":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Hazel Eyes Probability")
return titleTranslated, false, formatPercentageFunction, "%", unknownTranslated, nil
}
case "OffspringBrownEyesProbability":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Brown Eyes Probability")
return titleTranslated, false, formatPercentageFunction, "%", unknownTranslated, nil
}
case "OffspringLactoseToleranceProbability":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Lactose Tolerance Probability")
return titleTranslated, false, formatPercentageFunction, "%", unknownTranslated, nil
}
case "OffspringStraightHairProbability":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Straight Hair Probability")
return titleTranslated, false, formatPercentageFunction, "%", unknownTranslated, nil
}
case "OffspringCurlyHairProbability":{
titleTranslated := translation.TranslateTextFromEnglishToMyLanguage("Offspring Curly Hair Probability")
return titleTranslated, false, formatPercentageFunction, "%", unknownTranslated, nil
}
}
attributeHasMonogenicDiseasePrefix := strings.HasPrefix(attributeName, "MonogenicDisease_")
@@ -946,6 +1112,18 @@ func GetProfileAttributeDisplayInfo(attributeName string)(string, bool, func(str
return titleTranslated, false, passValueFunction, "", noResponseTranslated, nil
}
hasLocusIsPhasedPrefix := strings.HasPrefix(attributeName, "LocusIsPhased_rs")
if (hasLocusIsPhasedPrefix == true){
locusRSID := strings.TrimPrefix(attributeName, "LocusIsPhased_")
locusTranslated := translation.TranslateTextFromEnglishToMyLanguage("Locus")
isPhasedTranslated := translation.TranslateTextFromEnglishToMyLanguage("Is Phased")
titleTranslated := locusTranslated + " " + locusRSID + " " + isPhasedTranslated
return titleTranslated, false, translateValueFunction, "", noResponseTranslated, nil
}
return "", false, nil, "", "", errors.New("GetProfileAttributeDisplayInfo called with unknown attributeName: " + attributeName)
}


@@ -1,7 +1,11 @@
package attributeDisplay_test
import "seekia/internal/globalSettings"
import "seekia/internal/profiles/attributeDisplay"
import "seekia/resources/geneticReferences/polygenicDiseases"
import "seekia/resources/geneticReferences/traits"
import "seekia/internal/globalSettings"
import "seekia/internal/profiles/calculatedAttributes"
import "seekia/internal/profiles/profileFormat"
@@ -15,6 +19,16 @@ func TestGetAttributeDisplayInfo(t *testing.T){
t.Fatalf("InitializeGlobalSettingsDatastore failed: " + err.Error())
}
err = polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = profileFormat.InitializeProfileFormatVariables()
if (err != nil) {
t.Fatalf("InitializeProfileFormatVariables failed: " + err.Error())


@@ -46,10 +46,6 @@ import "slices"
//TODO:
// -LastActive
// -OffspringLactoseToleranceProbability
// Used to sort users based on probability of lactose tolerance
// This allows the user to sort matches based on whose offspring is most likely to be lactose tolerant
// -Offspring Probability for all traits
// -DietSimilarity
@@ -62,12 +58,6 @@ var calculatedAttributesList = []string{
"Distance",
"IsSameSex",
"23andMe_OffspringNeanderthalVariants",
"OffspringProbabilityOfAnyMonogenicDisease",
"OffspringProbabilityOfAnyMonogenicDisease_NumberOfDiseasesTested",
"TotalPolygenicDiseaseRiskScore",
"TotalPolygenicDiseaseRiskScore_NumberOfDiseasesTested",
"OffspringTotalPolygenicDiseaseRiskScore",
"OffspringTotalPolygenicDiseaseRiskScore_NumberOfDiseasesTested",
"SearchTermsCount",
"HasMessagedMe",
"IHaveMessaged",
@@ -95,6 +85,46 @@ var calculatedAttributesList = []string{
"23andMe_MaternalHaplogroupSimilarity",
"23andMe_PaternalHaplogroupSimilarity",
"NumberOfReviews",
"OffspringProbabilityOfAnyMonogenicDisease",
"OffspringProbabilityOfAnyMonogenicDisease_NumberOfDiseasesTested",
// Polygenic Diseases:
"TotalPolygenicDiseaseRiskScore",
"TotalPolygenicDiseaseRiskScore_NumberOfDiseasesTested",
"OffspringTotalPolygenicDiseaseRiskScore",
"OffspringTotalPolygenicDiseaseRiskScore_NumberOfDiseasesTested",
"AutismRiskScore",
"OffspringAutismRiskScore",
"ObesityRiskScore",
"OffspringObesityRiskScore",
// Discrete Traits:
"PredictedEyeColor",
"OffspringBlueEyesProbability",
"OffspringGreenEyesProbability",
"OffspringHazelEyesProbability",
"OffspringBrownEyesProbability",
"PredictedLactoseTolerance",
"OffspringLactoseToleranceProbability",
"PredictedHairTexture",
"OffspringStraightHairProbability",
"OffspringCurlyHairProbability",
// Numeric Traits:
"HomosexualnessScore",
"OffspringHomosexualnessScore",
"PredictedHeight",
"OffspringPredictedHeight",
}
// We use a map for faster lookups
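The list-plus-map pattern referenced by the comment above can be sketched as follows. The names here are illustrative stand-ins, not Seekia's actual identifiers:

```go
package main

import "fmt"

// buildAttributeSet converts a slice of attribute names into a set,
// so membership checks are O(1) map lookups instead of O(n) slice scans.
func buildAttributeSet(attributesList []string) map[string]struct{} {
	attributeSet := make(map[string]struct{}, len(attributesList))
	for _, attributeName := range attributesList {
		attributeSet[attributeName] = struct{}{}
	}
	return attributeSet
}

func main() {
	attributeSet := buildAttributeSet([]string{"Distance", "PredictedEyeColor"})
	_, exists := attributeSet["PredictedEyeColor"]
	fmt.Println(exists) // true
}
```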
@@ -679,43 +709,14 @@ func GetAnyProfileAttributeIncludingCalculated(attributeName string, getProfileA
for _, diseaseObject := range polygenicDiseaseObjectsList{
// Map Structure: Locus rsID -> Locus Value
userDiseaseLocusValuesMap := make(map[int64]locusValue.LocusValue)
diseaseLociList := diseaseObject.LociList
for _, locusObject := range diseaseLociList{
locusRSID := locusObject.LocusRSID
locusRSIDString := helpers.ConvertInt64ToString(locusRSID)
locusValueAttributeName := "LocusValue_rs" + locusRSIDString
userLocusBasePairExists, _, userLocusBasePair, err := getProfileAttributesFunction(locusValueAttributeName)
if (err != nil) { return false, 0, "", err }
if (userLocusBasePairExists == false){
continue
}
userLocusBase1, userLocusBase2, semicolonFound := strings.Cut(userLocusBasePair, ";")
if (semicolonFound == false){
return false, 0, "", errors.New("Database corrupt: Contains profile with invalid " + locusValueAttributeName + " value: " + userLocusBasePair)
}
userLocusValue := locusValue.LocusValue{
Base1Value: userLocusBase1,
Base2Value: userLocusBase2,
//TODO: Share LocusIsPhased information in user profiles and retrieve it into this value
LocusIsPhased: false,
}
userDiseaseLocusValuesMap[locusRSID] = userLocusValue
}
anyLocusTested, userDiseaseRiskScore, _, _, err := createPersonGeneticAnalysis.GetPersonGenomePolygenicDiseaseInfo(diseaseLociList, userDiseaseLocusValuesMap, true)
userDiseaseLocusValuesMap, err := GetUserGenomeLocusValuesMapFromProfile(diseaseLociList, getProfileAttributesFunction)
if (err != nil) { return false, 0, "", err }
if (anyLocusTested == false){
neuralNetworkExists, anyLocusTested, userDiseaseRiskScore, _, _, _, err := createPersonGeneticAnalysis.GetPersonGenomePolygenicDiseaseAnalysis(diseaseObject, userDiseaseLocusValuesMap, true)
if (err != nil) { return false, 0, "", err }
if (neuralNetworkExists == false || anyLocusTested == false){
continue
}
@@ -761,11 +762,8 @@ func GetAnyProfileAttributeIncludingCalculated(attributeName string, getProfileA
if (myPersonChosen == false || myGenomesExist == false || myAnalysisIsReady == false){
// We have not linked a genome person and performed a genetic analysis
// The total monogenic disease risk is unknown
// We can still predict disease risk for individual recessive disorders when one person has no variants, but
// not the total probability for all monogenic diseases
// This is because everyone is a carrier for at least some recessive monogenic disorders
//
// The total polygenic disease risk is unknown
return false, profileVersion, "", nil
}
@@ -790,41 +788,12 @@ func GetAnyProfileAttributeIncludingCalculated(attributeName string, getProfileA
diseaseLociList := diseaseObject.LociList
// Map Structure: rsID -> Locus Value
userDiseaseLocusValuesMap := make(map[int64]locusValue.LocusValue)
for _, locusObject := range diseaseLociList{
locusRSID := locusObject.LocusRSID
locusRSIDString := helpers.ConvertInt64ToString(locusRSID)
locusValueAttributeName := "LocusValue_rs" + locusRSIDString
userLocusBasePairExists, _, userLocusBasePair, err := getProfileAttributesFunction(locusValueAttributeName)
if (err != nil) { return false, 0, "", err }
if (userLocusBasePairExists == false){
continue
}
userLocusBase1, userLocusBase2, semicolonFound := strings.Cut(userLocusBasePair, ";")
if (semicolonFound == false){
return false, 0, "", errors.New("GetAnyProfileAttributeIncludingCalculated called with profile containing invalid " + locusValueAttributeName + ": " + userLocusBasePair)
}
newLocusValue := locusValue.LocusValue{
Base1Value: userLocusBase1,
Base2Value: userLocusBase2,
//TODO: Share locusIsPhased information in user profiles and put it here
LocusIsPhased: false,
}
userDiseaseLocusValuesMap[locusRSID] = newLocusValue
}
anyLocusValuesTested, offspringAverageRiskScore, _, err := createCoupleGeneticAnalysis.GetOffspringPolygenicDiseaseInfo_Fast(diseaseLociList, myGenomeLocusValuesMap, userDiseaseLocusValuesMap)
userDiseaseLocusValuesMap, err := GetUserGenomeLocusValuesMapFromProfile(diseaseLociList, getProfileAttributesFunction)
if (err != nil) { return false, 0, "", err }
if (anyLocusValuesTested == false){
neuralNetworkExists, anyLocusValuesTested, offspringAverageRiskScore, _, _, _, _, err := createCoupleGeneticAnalysis.GetOffspringPolygenicDiseaseAnalysis(diseaseObject, myGenomeLocusValuesMap, userDiseaseLocusValuesMap)
if (err != nil) { return false, 0, "", err }
if (neuralNetworkExists == false || anyLocusValuesTested == false){
continue
}
@@ -851,6 +820,386 @@ func GetAnyProfileAttributeIncludingCalculated(attributeName string, getProfileA
return true, profileVersion, allDiseasesAverageRiskScoreString, nil
}
case "AutismRiskScore",
"ObesityRiskScore":{
// These are polygenic diseases
// We get the risk score for the user
diseaseName := strings.TrimSuffix(attributeName, "RiskScore")
diseaseObject, err := polygenicDiseases.GetPolygenicDiseaseObject(diseaseName)
if (err != nil){ return false, 0, "", err }
diseaseLociList := diseaseObject.LociList
userDiseaseLocusValuesMap, err := GetUserGenomeLocusValuesMapFromProfile(diseaseLociList, getProfileAttributesFunction)
if (err != nil) { return false, 0, "", err }
neuralNetworkExists, anyLocusTested, userDiseaseRiskScore, _, _, _, err := createPersonGeneticAnalysis.GetPersonGenomePolygenicDiseaseAnalysis(diseaseObject, userDiseaseLocusValuesMap, true)
if (err != nil) { return false, 0, "", err }
if (neuralNetworkExists == false){
return false, 0, "", errors.New("Neural network missing for disease: " + diseaseName)
}
if (anyLocusTested == false){
// Disease risk is unknown
return false, profileVersion, "", nil
}
riskScoreString := helpers.ConvertIntToString(userDiseaseRiskScore)
return true, profileVersion, riskScoreString, nil
}
case "OffspringAutismRiskScore",
"OffspringObesityRiskScore":{
// These are polygenic diseases
// We get the risk score for the offspring
myPersonChosen, myGenomesExist, myAnalysisIsReady, myGeneticAnalysisObject, myGenomeIdentifier, _, err := myChosenAnalysis.GetMyChosenMateGeneticAnalysis()
if (err != nil) { return false, 0, "", err }
if (myPersonChosen == false || myGenomesExist == false || myAnalysisIsReady == false){
// We have not linked a genome person and performed a genetic analysis
// All offspring polygenic disease risks are unknown
return false, profileVersion, "", nil
}
_, _, _, _, myGenomesMap, err := readGeneticAnalysis.GetMetadataFromPersonGeneticAnalysis(myGeneticAnalysisObject)
if (err != nil) { return false, 0, "", err }
myGenomeLocusValuesMap, exists := myGenomesMap[myGenomeIdentifier]
if (exists == false){
return false, 0, "", errors.New("GetMyChosenMateGeneticAnalysis returning genetic analysis which has GenomesMap which is missing my genome identifier.")
}
diseaseNameWithOffspring := strings.TrimSuffix(attributeName, "RiskScore")
diseaseName := strings.TrimPrefix(diseaseNameWithOffspring, "Offspring")
diseaseObject, err := polygenicDiseases.GetPolygenicDiseaseObject(diseaseName)
if (err != nil){ return false, 0, "", err }
diseaseLociList := diseaseObject.LociList
userDiseaseLocusValuesMap, err := GetUserGenomeLocusValuesMapFromProfile(diseaseLociList, getProfileAttributesFunction)
if (err != nil) { return false, 0, "", err }
neuralNetworkExists, anyLocusValuesTested, offspringAverageRiskScore, _, _, _, _, err := createCoupleGeneticAnalysis.GetOffspringPolygenicDiseaseAnalysis(diseaseObject, myGenomeLocusValuesMap, userDiseaseLocusValuesMap)
if (err != nil) { return false, 0, "", err }
if (neuralNetworkExists == false){
return false, 0, "", errors.New("No neural network exists for disease: " + diseaseName)
}
if (anyLocusValuesTested == false){
// No disease loci are known
return false, profileVersion, "", nil
}
offspringAverageRiskScoreString := helpers.ConvertIntToString(offspringAverageRiskScore)
return true, profileVersion, offspringAverageRiskScoreString, nil
}
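The two risk-score cases above recover the disease name from the attribute name with the same TrimSuffix/TrimPrefix pair. Condensed into one hypothetical helper (the diff inlines these calls rather than defining this function):

```go
package main

import (
	"fmt"
	"strings"
)

// diseaseNameFromAttribute recovers the polygenic disease name from a
// calculated attribute name such as "OffspringAutismRiskScore" or
// "ObesityRiskScore". The "Offspring" prefix is optional, so TrimPrefix
// is a no-op for the non-offspring variants.
func diseaseNameFromAttribute(attributeName string) string {
	diseaseName := strings.TrimSuffix(attributeName, "RiskScore")
	diseaseName = strings.TrimPrefix(diseaseName, "Offspring")
	return diseaseName
}

func main() {
	fmt.Println(diseaseNameFromAttribute("OffspringAutismRiskScore")) // Autism
	fmt.Println(diseaseNameFromAttribute("ObesityRiskScore"))         // Obesity
}
```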
case "PredictedEyeColor",
"PredictedLactoseTolerance",
"PredictedHairTexture":{
//Outputs:
// -string: Trait name
getTraitName := func()(string, error){
switch attributeName{
case "PredictedEyeColor":{
return "Eye Color", nil
}
case "PredictedLactoseTolerance":{
return "Lactose Tolerance", nil
}
case "PredictedHairTexture":{
return "Hair Texture", nil
}
}
return "", errors.New("Discrete trait calculated attribute reached with unknown attributeName: " + attributeName)
}
traitName, err := getTraitName()
if (err != nil) { return false, 0, "", err }
traitObject, err := traits.GetTraitObject(traitName)
if (err != nil) { return false, 0, "", err }
traitLociList := traitObject.LociList
userTraitLocusValuesMap, err := GetUserGenomeLocusValuesMapFromProfile(traitLociList, getProfileAttributesFunction)
if (err != nil) { return false, 0, "", err }
neuralNetworkExists, neuralNetworkOutcomeIsKnown, predictedOutcome, _, _, _, err := createPersonGeneticAnalysis.GetGenomeDiscreteTraitAnalysis_NeuralNetwork(traitObject, userTraitLocusValuesMap, true)
if (err != nil) { return false, 0, "", err }
if (neuralNetworkExists == true){
if (neuralNetworkOutcomeIsKnown == false){
return false, 0, "", nil
}
return true, profileVersion, predictedOutcome, nil
}
anyRulesExist, _, _, _, predictedOutcomeExists, predictedOutcome, err := createPersonGeneticAnalysis.GetGenomeDiscreteTraitAnalysis_Rules(traitObject, userTraitLocusValuesMap, true)
if (err != nil) { return false, 0, "", err }
if (anyRulesExist == false){
return false, 0, "", errors.New("Discrete trait calculated attribute exists for trait without a neural network or rules: " + traitName)
}
if (predictedOutcomeExists == false){
return false, 0, "", nil
}
return true, profileVersion, predictedOutcome, nil
}
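The discrete-trait case above tries the neural network predictor first and only falls back to rule-based prediction when no network exists for the trait. That control flow can be sketched generically; both predictor funcs below are hypothetical stand-ins for the createPersonGeneticAnalysis calls:

```go
package main

import (
	"errors"
	"fmt"
)

// predictOutcome runs the neural network predictor first. Only when no
// network exists for the trait does it fall back to the rules predictor.
// An error is returned if neither prediction method is available.
func predictOutcome(
	networkPredict func() (exists bool, known bool, outcome string, err error),
	rulesPredict func() (rulesExist bool, known bool, outcome string, err error),
) (bool, string, error) {

	networkExists, outcomeKnown, outcome, err := networkPredict()
	if err != nil {
		return false, "", err
	}
	if networkExists {
		return outcomeKnown, outcome, nil
	}
	rulesExist, outcomeKnown, outcome, err := rulesPredict()
	if err != nil {
		return false, "", err
	}
	if !rulesExist {
		return false, "", errors.New("trait has neither a neural network nor rules")
	}
	return outcomeKnown, outcome, nil
}

func main() {
	// No neural network exists, so the rules predictor decides the outcome.
	known, outcome, err := predictOutcome(
		func() (bool, bool, string, error) { return false, false, "", nil },
		func() (bool, bool, string, error) { return true, true, "Blue", nil },
	)
	fmt.Println(known, outcome, err)
}
```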
case "OffspringBlueEyesProbability",
"OffspringGreenEyesProbability",
"OffspringHazelEyesProbability",
"OffspringBrownEyesProbability",
"OffspringLactoseToleranceProbability",
"OffspringStraightHairProbability",
"OffspringCurlyHairProbability":{
//TODO: Add ability to retrieve confidence/quantity of loci known and sort/filter by those attributes
myPersonChosen, myGenomesExist, myAnalysisIsReady, myGeneticAnalysisObject, myGenomeIdentifier, _, err := myChosenAnalysis.GetMyChosenMateGeneticAnalysis()
if (err != nil) { return false, 0, "", err }
if (myPersonChosen == false || myGenomesExist == false || myAnalysisIsReady == false){
// We have not linked a genome person and performed a genetic analysis
// All offspring trait predictions are unknown
return false, profileVersion, "", nil
}
_, _, _, _, myGenomesMap, err := readGeneticAnalysis.GetMetadataFromPersonGeneticAnalysis(myGeneticAnalysisObject)
if (err != nil) { return false, 0, "", err }
myGenomeLocusValuesMap, exists := myGenomesMap[myGenomeIdentifier]
if (exists == false){
return false, 0, "", errors.New("GetMyChosenMateGeneticAnalysis returning genetic analysis which has GenomesMap which is missing my genome identifier.")
}
// These are discrete traits
// We get the outcome probability for the offspring
//Outputs:
// -string: Trait name
// -string: Outcome name
getTraitAndOutcomeName := func()(string, string, error){
switch attributeName{
case "OffspringBlueEyesProbability":{
return "Eye Color", "Blue", nil
}
case "OffspringGreenEyesProbability":{
return "Eye Color", "Green", nil
}
case "OffspringHazelEyesProbability":{
return "Eye Color", "Hazel", nil
}
case "OffspringBrownEyesProbability":{
return "Eye Color", "Brown", nil
}
case "OffspringLactoseToleranceProbability":{
return "Lactose Tolerance", "Tolerant", nil
}
case "OffspringStraightHairProbability":{
return "Hair Texture", "Straight", nil
}
case "OffspringCurlyHairProbability":{
return "Hair Texture", "Curly", nil
}
}
return "", "", errors.New("Offspring discrete trait calculated attribute reached with unknown attributeName: " + attributeName)
}
traitName, outcomeName, err := getTraitAndOutcomeName()
if (err != nil) { return false, 0, "", err }
traitObject, err := traits.GetTraitObject(traitName)
if (err != nil) { return false, 0, "", err }
traitLociList := traitObject.LociList
userTraitLocusValuesMap, err := GetUserGenomeLocusValuesMapFromProfile(traitLociList, getProfileAttributesFunction)
if (err != nil) { return false, 0, "", err }
neuralNetworkExists, anyLociKnown, outcomeProbabilitiesMap, _, _, _, err := createCoupleGeneticAnalysis.GetOffspringDiscreteTraitAnalysis_NeuralNetwork(traitObject, myGenomeLocusValuesMap, userTraitLocusValuesMap)
if (err != nil) { return false, 0, "", err }
if (neuralNetworkExists == true){
if (anyLociKnown == false){
// Trait prediction is not possible
return false, 0, "", nil
}
outcomeProbability, exists := outcomeProbabilitiesMap[outcomeName]
if (exists == false){
return true, profileVersion, "0", nil
}
outcomeProbabilityString := helpers.ConvertIntToString(outcomeProbability)
return true, profileVersion, outcomeProbabilityString, nil
}
anyRulesExist, rulesAnalysisExists, _, _, _, outcomeProbabilitiesMap, err := createCoupleGeneticAnalysis.GetOffspringDiscreteTraitAnalysis_Rules(traitObject, myGenomeLocusValuesMap, userTraitLocusValuesMap)
if (err != nil) { return false, 0, "", err }
if (anyRulesExist == false){
return false, 0, "", errors.New("Calculation of offspring attribute for discrete trait called with trait missing neural network or rules: " + traitName)
}
if (rulesAnalysisExists == false){
// Analysis is impossible for this trait
return false, 0, "", nil
}
outcomeProbability, exists := outcomeProbabilitiesMap[outcomeName]
if (exists == false){
return true, profileVersion, "0", nil
}
outcomeProbabilityString := helpers.ConvertIntToString(outcomeProbability)
return true, profileVersion, outcomeProbabilityString, nil
}
case "HomosexualnessScore",
"PredictedHeight":{
// These are numeric traits
// We calculate the value for the user
//Outputs:
// -string: Trait name
// -bool: Is a score between 0-10
// -error
getTraitNameAndIsAScoreBool := func()(string, bool, error){
switch attributeName{
case "HomosexualnessScore":{
return "Homosexualness", true, nil
}
case "PredictedHeight":{
return "Height", false, nil
}
}
return "", false, errors.New("User numeric trait value calculation called with unknown attributeName: " + attributeName)
}
traitName, traitIsAScore, err := getTraitNameAndIsAScoreBool()
if (err != nil) { return false, 0, "", err }
traitObject, err := traits.GetTraitObject(traitName)
if (err != nil) { return false, 0, "", err }
traitLociList := traitObject.LociList
userTraitLocusValuesMap, err := GetUserGenomeLocusValuesMapFromProfile(traitLociList, getProfileAttributesFunction)
if (err != nil) { return false, 0, "", err }
traitNeuralNetworkExists, anyLocusValuesAreKnown, predictedOutcome, _, _, _, err := createPersonGeneticAnalysis.GetGenomeNumericTraitAnalysis(traitObject, userTraitLocusValuesMap, true)
if (err != nil) { return false, 0, "", err }
if (traitNeuralNetworkExists == false){
return false, 0, "", errors.New("Numeric trait attribute calculation reached for trait with no neural network: " + traitName)
}
if (anyLocusValuesAreKnown == false){
// Trait prediction is impossible
return false, 0, "", nil
}
if (traitIsAScore == true){
predictedScoreString := helpers.ConvertIntToString(int(predictedOutcome))
return true, profileVersion, predictedScoreString, nil
}
predictedOutcomeString := helpers.ConvertFloat64ToString(predictedOutcome)
return true, profileVersion, predictedOutcomeString, nil
}
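The score-versus-continuous branch at the end of the numeric trait case above converts the prediction differently depending on the trait type. A minimal sketch of that formatting decision, as a hypothetical helper outside the Seekia codebase:

```go
package main

import (
	"fmt"
	"strconv"
)

// formatNumericTraitPrediction mirrors the branch above: score-type
// traits (a 0-10 scale such as Homosexualness) are truncated to an
// integer string, while continuous traits such as Height keep their
// floating-point representation.
func formatNumericTraitPrediction(predictedOutcome float64, traitIsAScore bool) string {
	if traitIsAScore {
		return strconv.Itoa(int(predictedOutcome))
	}
	return strconv.FormatFloat(predictedOutcome, 'f', -1, 64)
}

func main() {
	fmt.Println(formatNumericTraitPrediction(7.8, true))    // 7
	fmt.Println(formatNumericTraitPrediction(182.5, false)) // 182.5
}
```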
case "OffspringHomosexualnessScore",
"OffspringPredictedHeight":{
// These are numeric traits
// We calculate the value for the offspring
myPersonChosen, myGenomesExist, myAnalysisIsReady, myGeneticAnalysisObject, myGenomeIdentifier, _, err := myChosenAnalysis.GetMyChosenMateGeneticAnalysis()
if (err != nil) { return false, 0, "", err }
if (myPersonChosen == false || myGenomesExist == false || myAnalysisIsReady == false){
// We have not linked a genome person and performed a genetic analysis
// All offspring trait predictions are unknown
return false, profileVersion, "", nil
}
_, _, _, _, myGenomesMap, err := readGeneticAnalysis.GetMetadataFromPersonGeneticAnalysis(myGeneticAnalysisObject)
if (err != nil) { return false, 0, "", err }
myGenomeLocusValuesMap, exists := myGenomesMap[myGenomeIdentifier]
if (exists == false){
return false, 0, "", errors.New("GetMyChosenMateGeneticAnalysis returning genetic analysis which has GenomesMap which is missing my genome identifier.")
}
//Outputs:
// -string: Trait name
// -bool: Is a score between 0-10
// -error
getTraitNameAndIsAScoreBool := func()(string, bool, error){
switch attributeName{
case "OffspringHomosexualnessScore":{
return "Homosexualness", true, nil
}
case "OffspringPredictedHeight":{
return "Height", false, nil
}
}
return "", false, errors.New("Offspring numeric trait value calculation called with unknown attributeName: " + attributeName)
}
traitName, traitIsAScore, err := getTraitNameAndIsAScoreBool()
if (err != nil) { return false, 0, "", err }
traitObject, err := traits.GetTraitObject(traitName)
if (err != nil) { return false, 0, "", err }
traitLociList := traitObject.LociList
userTraitLocusValuesMap, err := GetUserGenomeLocusValuesMapFromProfile(traitLociList, getProfileAttributesFunction)
if (err != nil) { return false, 0, "", err }
neuralNetworkExists, anyLociKnown, predictedOutcome, _, _, _, _, err := createCoupleGeneticAnalysis.GetOffspringNumericTraitAnalysis(traitObject, userTraitLocusValuesMap, myGenomeLocusValuesMap)
if (err != nil) { return false, 0, "", err }
if (neuralNetworkExists == false){
return false, 0, "", errors.New("Offspring numeric trait attribute calculation reached for trait with no neural network: " + traitName)
}
if (anyLociKnown == false){
// Prediction is impossible
return false, profileVersion, "", nil
}
if (traitIsAScore == true){
predictedScoreString := helpers.ConvertIntToString(int(predictedOutcome))
return true, profileVersion, predictedScoreString, nil
}
predictedOutcomeString := helpers.ConvertFloat64ToString(predictedOutcome)
return true, profileVersion, predictedOutcomeString, nil
}
case "SearchTermsCount":{
myDesireExists, myDesiredChoicesListString, err := myLocalDesires.GetDesire("SearchTerms")
@@ -1629,7 +1978,55 @@ func GetAnyProfileAttributeIncludingCalculated(attributeName string, getProfileA
return false, 0, "", errors.New("GetAnyProfileAttributeIncludingCalculated called with unknown attribute: " + attributeName)
}
// This function constructs a traitLocusValuesMap from a user's profile and a set of loci
//Outputs:
// -map[int64]locusValue.LocusValue: Genome map for provided loci
// -error
func GetUserGenomeLocusValuesMapFromProfile(lociList []int64, getProfileAttributesFunction func(string)(bool, int, string, error))(map[int64]locusValue.LocusValue, error){
// We construct the user's locus values map
// Map Structure: Locus rsID -> locusValue.LocusValue
userTraitLocusValuesMap := make(map[int64]locusValue.LocusValue)
for _, rsID := range lociList{
rsIDString := helpers.ConvertInt64ToString(rsID)
userLocusValueAttributeName := "LocusValue_rs" + rsIDString
userLocusValueIsKnown, _, userLocusValue, err := getProfileAttributesFunction(userLocusValueAttributeName)
if (err != nil) { return nil, err }
if (userLocusValueIsKnown == false){
continue
}
userLocusBase1, userLocusBase2, semicolonFound := strings.Cut(userLocusValue, ";")
if (semicolonFound == false){
return nil, errors.New("Database corrupt: Contains profile with invalid " + userLocusValueAttributeName + " value: " + userLocusValue)
}
userLocusIsPhasedAttributeName := "LocusIsPhased_rs" + rsIDString
userLocusIsPhasedExists, _, userLocusIsPhasedString, err := getProfileAttributesFunction(userLocusIsPhasedAttributeName)
if (err != nil) { return nil, err }
if (userLocusIsPhasedExists == false){
return nil, errors.New("Database corrupt: Contains profile with locusValue but not locusIsPhased status for locus: " + rsIDString)
}
userLocusIsPhased, err := helpers.ConvertYesOrNoStringToBool(userLocusIsPhasedString)
if (err != nil) { return nil, err }
userLocusValueObject := locusValue.LocusValue{
Base1Value: userLocusBase1,
Base2Value: userLocusBase2,
LocusIsPhased: userLocusIsPhased,
}
userTraitLocusValuesMap[rsID] = userLocusValueObject
}
return userTraitLocusValuesMap, nil
}
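GetUserGenomeLocusValuesMapFromProfile above decodes two string-encoded profile attributes per locus: a semicolon-separated base pair ("A;G") and a Yes/No phased flag. A self-contained sketch of just that decoding step, with a simplified LocusValue type and an inline Yes/No switch standing in for helpers.ConvertYesOrNoStringToBool:

```go
package main

import (
	"errors"
	"fmt"
	"strings"
)

// LocusValue mirrors the shape of the locusValue.LocusValue struct used above.
type LocusValue struct {
	Base1Value    string
	Base2Value    string
	LocusIsPhased bool
}

// parseLocusAttributes decodes a profile's base pair string ("A;G") and
// Yes/No phased flag into a LocusValue. Hypothetical helper; the real
// function interleaves this logic with profile attribute retrieval.
func parseLocusAttributes(basePair string, isPhased string) (LocusValue, error) {
	base1, base2, semicolonFound := strings.Cut(basePair, ";")
	if !semicolonFound {
		return LocusValue{}, errors.New("invalid locus base pair: " + basePair)
	}
	var locusIsPhased bool
	switch isPhased {
	case "Yes":
		locusIsPhased = true
	case "No":
		locusIsPhased = false
	default:
		return LocusValue{}, errors.New("invalid LocusIsPhased value: " + isPhased)
	}
	return LocusValue{Base1Value: base1, Base2Value: base2, LocusIsPhased: locusIsPhased}, nil
}

func main() {
	locusValueObject, err := parseLocusAttributes("A;G", "Yes")
	fmt.Println(locusValueObject, err)
}
```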


@@ -26,7 +26,7 @@ func TestCalculatedAttributes(t *testing.T){
calculatedAttributesList := calculatedAttributes.GetCalculatedAttributesList()
for i:=0; i<100; i++{
for i:=0; i<10; i++{
identityPublicKey, identityPrivateKey, err := identity.GetNewRandomPublicPrivateIdentityKeys()
if (err != nil){


@@ -153,11 +153,6 @@ func UpdateMyExportedProfile(myProfileType string, networkType byte)error{
_, _, _, _, myGenomesMap, err := readGeneticAnalysis.GetMetadataFromPersonGeneticAnalysis(myGeneticAnalysisObject)
if (err != nil) { return err }
myGenomeLocusValuesMap, exists := myGenomesMap[genomeIdentifierToShare]
if (exists == false){
return errors.New("GetMyChosenMateGeneticAnalysis returning genetic analysis which has GenomesMap which is missing my genome identifier.")
}
monogenicDiseaseNamesList, err := monogenicDiseases.GetMonogenicDiseaseNamesList()
if (err != nil) { return err }
@@ -196,7 +191,11 @@ func UpdateMyExportedProfile(myProfileType string, networkType byte)error{
}
polygenicDiseaseObjectsList, err := polygenicDiseases.GetPolygenicDiseaseObjectsList()
if (err != nil) { return err }
if (err != nil) { return err }
// This map stores the rsIDs to share in our profile
// We use a map to avoid duplicates
myLociToShareMap := make(map[int64]struct{})
for _, diseaseObject := range polygenicDiseaseObjectsList{
@@ -215,22 +214,9 @@ func UpdateMyExportedProfile(myProfileType string, networkType byte)error{
lociList := diseaseObject.LociList
for _, locusObject := range lociList{
for _, rsID := range lociList{
locusRSID := locusObject.LocusRSID
locusValueObject, exists := myGenomeLocusValuesMap[locusRSID]
if (exists == true){
rsIDString := helpers.ConvertInt64ToString(locusRSID)
locusBase1 := locusValueObject.Base1Value
locusBase2 := locusValueObject.Base2Value
basePairValue := locusBase1 + ";" + locusBase2
profileMap["LocusValue_rs" + rsIDString] = basePairValue
}
myLociToShareMap[rsID] = struct{}{}
}
}
@@ -256,21 +242,38 @@ func UpdateMyExportedProfile(myProfileType string, networkType byte)error{
for _, rsID := range lociList{
locusValueObject, exists := myGenomeLocusValuesMap[rsID]
if (exists == true){
rsIDString := helpers.ConvertInt64ToString(rsID)
locusBase1 := locusValueObject.Base1Value
locusBase2 := locusValueObject.Base2Value
basePairValue := locusBase1 + ";" + locusBase2
profileMap["LocusValue_rs" + rsIDString] = basePairValue
}
myLociToShareMap[rsID] = struct{}{}
}
}
myGenomeLocusValuesMap, exists := myGenomesMap[genomeIdentifierToShare]
if (exists == false){
return errors.New("GetMyChosenMateGeneticAnalysis returning genetic analysis which has GenomesMap which is missing my genome identifier.")
}
for rsID, _ := range myLociToShareMap{
locusValueObject, exists := myGenomeLocusValuesMap[rsID]
if (exists == false){
continue
}
rsIDString := helpers.ConvertInt64ToString(rsID)
locusBase1 := locusValueObject.Base1Value
locusBase2 := locusValueObject.Base2Value
locusIsPhased := locusValueObject.LocusIsPhased
basePairValue := locusBase1 + ";" + locusBase2
locusIsPhasedString := helpers.ConvertBoolToYesOrNoString(locusIsPhased)
locusValueAttributeName := "LocusValue_rs" + rsIDString
locusIsPhasedAttributeName := "LocusIsPhased_rs" + rsIDString
profileMap[locusValueAttributeName] = basePairValue
profileMap[locusIsPhasedAttributeName] = locusIsPhasedString
}
return nil
}
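The export refactor above first deduplicates the loci from every disease and trait into a set, then emits one LocusValue_rs entry per locus present in the chosen genome. A condensed sketch with simplified types (the real code also writes a Yes/No LocusIsPhased_rs entry alongside each locus value):

```go
package main

import (
	"fmt"
	"strconv"
)

// buildLocusProfileEntries writes a "LocusValue_rs<rsID>" profile entry
// for each locus in the deduplicated share set that exists in the genome
// map. Loci absent from the genome are skipped. The [2]string value is a
// simplified stand-in for a locus value object's two bases.
func buildLocusProfileEntries(lociToShare map[int64]struct{}, genome map[int64][2]string) map[string]string {
	profileMap := make(map[string]string)
	for rsID := range lociToShare {
		bases, exists := genome[rsID]
		if !exists {
			continue
		}
		rsIDString := strconv.FormatInt(rsID, 10)
		profileMap["LocusValue_rs"+rsIDString] = bases[0] + ";" + bases[1]
	}
	return profileMap
}

func main() {
	entries := buildLocusProfileEntries(
		map[int64]struct{}{4988235: {}, 16942: {}},
		map[int64][2]string{4988235: {"A", "G"}},
	)
	fmt.Println(entries["LocusValue_rs4988235"]) // A;G
}
```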


@@ -10,6 +10,8 @@ package profileFormat
// The order was tarnished after I added and removed some attributes
import "seekia/resources/currencies"
import "seekia/resources/geneticReferences/traits"
import "seekia/resources/geneticReferences/polygenicDiseases"
import "seekia/resources/imageFiles"
import "seekia/resources/worldLanguages"
import "seekia/resources/worldLocations"
@@ -93,7 +95,8 @@ type AttributeObject struct{
// This must be run once upon application startup
func InitializeProfileFormatVariables()error{
initializeProfileAttributeObjectsList()
err := initializeProfileAttributeObjectsList()
if (err != nil) { return err }
profileAttributeObjectsList, err := GetProfileAttributeObjectsList()
if (err != nil) { return err }
@@ -219,7 +222,7 @@ func GetProfileAttributeObjectsList()([]AttributeObject, error){
return profileAttributeObjectsList, nil
}
func initializeProfileAttributeObjectsList(){
func initializeProfileAttributeObjectsList()error{
// Below are some standard getAttribute functions
@@ -2231,6 +2234,52 @@ func initializeProfileAttributeObjectsList(){
addMonogenicDiseaseNumberOfVariantsTestedAttribute(61, "MonogenicDisease_Sickle_Cell_Anemia_NumberOfVariantsTested")
addMonogenicDiseaseVariantProbabilityAttribute(62, "MonogenicDisease_Sickle_Cell_Anemia_ProbabilityOfPassingAVariant")
// TODO: Change attributeIdentifiers so:
// -Polygenic diseases are allotted the range: 1000 - 1999
// -Monogenic diseases are allotted the range: 2000 - 9,999
// -rsIDs are allotted the range: 10,000 - 3,000,000 (profiles will probably never share more than 500,000 loci)
// We build the profile from the traits/polygenic diseases objects list
// This approach is temporary
// Once we have profile versions on a testnet/mainnet, we have to keep the loci static for each profile version
// For now, profile encodings will change whenever we add/remove locus metadata
// This map will store all rsIDs for traits and polygenic diseases
shareableRSIDsMap := make(map[int64]struct{})
traitObjectsList, err := traits.GetTraitObjectsList()
if (err != nil){ return err }
for _, traitObject := range traitObjectsList{
traitLociList := traitObject.LociList
for _, rsID := range traitLociList{
shareableRSIDsMap[rsID] = struct{}{}
}
}
polygenicDiseaseObjectsList, err := polygenicDiseases.GetPolygenicDiseaseObjectsList()
if (err != nil) { return err }
for _, diseaseObject := range polygenicDiseaseObjectsList{
diseaseLociList := diseaseObject.LociList
for _, rsID := range diseaseLociList{
shareableRSIDsMap[rsID] = struct{}{}
}
}
shareableRSIDsList := helpers.GetListOfMapKeys(shareableRSIDsMap)
// We sort rsIDs so they are always in the same order
slices.Sort(shareableRSIDsList)
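Sorting the collected map keys matters here because Go deliberately randomizes map iteration order; without the sort, attribute identifiers would be assigned nondeterministically across runs. The collect-and-sort step can be sketched in isolation:

```go
package main

import (
	"fmt"
	"slices"
)

// sortedKeys collects a set's keys into a slice and sorts them, giving a
// deterministic iteration order that is stable across program runs.
func sortedKeys(rsIDSet map[int64]struct{}) []int64 {
	keys := make([]int64, 0, len(rsIDSet))
	for rsID := range rsIDSet {
		keys = append(keys, rsID)
	}
	slices.Sort(keys)
	return keys
}

func main() {
	fmt.Println(sortedKeys(map[int64]struct{}{182549: {}, 16942: {}, 4988235: {}}))
}
```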
validBasesList := []string{"C", "A", "T", "G", "I", "D"}
checkValueFunction_GenomeBasePair := func(profileVersion int, profileType string, input string)(bool, bool, error){
@@ -2261,14 +2310,23 @@ func initializeProfileAttributeObjectsList(){
return true, true, nil
}
addLocusValueAttributeObject := func(attributeIdentifier int, attributeName string){
addLocusValueAttributeObject := func(attributeIdentifier int, attributeName string, mandatoryAttributeName string){
getMandatoryAttributeFunction := func(profileVersion int)([]string, error){
if (profileVersion != 1){
return nil, errors.New("Trying to retrieve mandatory attributes for unknown profile version.")
}
mandatoryAttributesList := []string{mandatoryAttributeName}
return mandatoryAttributesList, nil
}
attributeObject_LocusValueBasePair := AttributeObject{
ProfileVersions: []int{1},
AttributeIdentifier: attributeIdentifier,
AttributeName: attributeName,
GetIsRequired: getIsRequired_No,
GetMandatoryAttributes: getMandatoryAttributes_None,
GetMandatoryAttributes: getMandatoryAttributeFunction,
GetProfileTypes: getProfileTypes_Mate,
GetIsCanonical: getIsCanonical_Always,
CheckValueFunction: checkValueFunction_GenomeBasePair,
@@ -2277,267 +2335,52 @@ func initializeProfileAttributeObjectsList(){
attributeObjectsList = append(attributeObjectsList, attributeObject_LocusValueBasePair)
}
// TODO: Add LocusIsPhased to each rsID
// Change attributeIdentifiers so:
// -Polygenic diseases are allotted the range: 1000 - 1999
// -Monogenic diseases are allotted the range: 2000 - 9,999
// -rsIDs are allotted the range: 10,000 - 3,000,000 (profiles will probably never share more than 500,000 loci)
addLocusIsPhasedAttributeObject := func(attributeIdentifier int, attributeName string, mandatoryAttributeName string){
addLocusValueAttributeObject(500, "LocusValue_rs16942")
addLocusValueAttributeObject(501, "LocusValue_rs1045485")
addLocusValueAttributeObject(502, "LocusValue_rs34330")
addLocusValueAttributeObject(503, "LocusValue_rs144848")
addLocusValueAttributeObject(504, "LocusValue_rs766173")
addLocusValueAttributeObject(505, "LocusValue_rs1799950")
addLocusValueAttributeObject(506, "LocusValue_rs4986850")
addLocusValueAttributeObject(507, "LocusValue_rs2227945")
addLocusValueAttributeObject(508, "LocusValue_rs1799966")
addLocusValueAttributeObject(509, "LocusValue_rs4987117")
addLocusValueAttributeObject(510, "LocusValue_rs1799954")
addLocusValueAttributeObject(511, "LocusValue_rs11571746")
addLocusValueAttributeObject(512, "LocusValue_rs4987047")
addLocusValueAttributeObject(513, "LocusValue_rs11571833")
addLocusValueAttributeObject(514, "LocusValue_rs1801426")
addLocusValueAttributeObject(515, "LocusValue_rs3218707")
addLocusValueAttributeObject(516, "LocusValue_rs4987945")
addLocusValueAttributeObject(517, "LocusValue_rs4986761")
addLocusValueAttributeObject(518, "LocusValue_rs3218695")
addLocusValueAttributeObject(519, "LocusValue_rs1800056")
addLocusValueAttributeObject(520, "LocusValue_rs1800057")
addLocusValueAttributeObject(521, "LocusValue_rs3092856")
addLocusValueAttributeObject(522, "LocusValue_rs1800058")
addLocusValueAttributeObject(523, "LocusValue_rs1801673")
addLocusValueAttributeObject(524, "LocusValue_rs17879961")
addLocusValueAttributeObject(525, "LocusValue_rs182549")
addLocusValueAttributeObject(526, "LocusValue_rs4988235")
addLocusValueAttributeObject(527, "LocusValue_rs7349332")
addLocusValueAttributeObject(528, "LocusValue_rs11803731")
addLocusValueAttributeObject(529, "LocusValue_rs17646946")
addLocusValueAttributeObject(530, "LocusValue_rs11571747")
addLocusValueAttributeObject(531, "LocusValue_rs7779616")
addLocusValueAttributeObject(532, "LocusValue_rs892839")
addLocusValueAttributeObject(533, "LocusValue_rs1003719")
addLocusValueAttributeObject(534, "LocusValue_rs7617069")
addLocusValueAttributeObject(535, "LocusValue_rs7174027")
addLocusValueAttributeObject(536, "LocusValue_rs989869")
addLocusValueAttributeObject(537, "LocusValue_rs2342494")
addLocusValueAttributeObject(538, "LocusValue_rs1158810")
addLocusValueAttributeObject(539, "LocusValue_rs1800414")
addLocusValueAttributeObject(540, "LocusValue_rs1540771")
addLocusValueAttributeObject(541, "LocusValue_rs26722")
addLocusValueAttributeObject(542, "LocusValue_rs1939707")
addLocusValueAttributeObject(543, "LocusValue_rs1800401")
addLocusValueAttributeObject(544, "LocusValue_rs17184180")
addLocusValueAttributeObject(545, "LocusValue_rs35051352")
addLocusValueAttributeObject(546, "LocusValue_rs1800422")
addLocusValueAttributeObject(547, "LocusValue_rs784416")
addLocusValueAttributeObject(548, "LocusValue_rs7803030")
addLocusValueAttributeObject(549, "LocusValue_rs16977009")
addLocusValueAttributeObject(550, "LocusValue_rs622330")
addLocusValueAttributeObject(551, "LocusValue_rs16863422")
addLocusValueAttributeObject(552, "LocusValue_rs12896399")
addLocusValueAttributeObject(553, "LocusValue_rs2422239")
addLocusValueAttributeObject(554, "LocusValue_rs7495174")
addLocusValueAttributeObject(555, "LocusValue_rs13016869")
addLocusValueAttributeObject(556, "LocusValue_rs2835630")
addLocusValueAttributeObject(557, "LocusValue_rs3809761")
addLocusValueAttributeObject(558, "LocusValue_rs11636232")
addLocusValueAttributeObject(559, "LocusValue_rs1805008")
addLocusValueAttributeObject(560, "LocusValue_rs3212368")
addLocusValueAttributeObject(561, "LocusValue_rs894883")
addLocusValueAttributeObject(562, "LocusValue_rs10266101")
addLocusValueAttributeObject(563, "LocusValue_rs911015")
addLocusValueAttributeObject(564, "LocusValue_rs974448")
addLocusValueAttributeObject(565, "LocusValue_rs6950754")
addLocusValueAttributeObject(566, "LocusValue_rs28777")
addLocusValueAttributeObject(567, "LocusValue_rs11855019")
addLocusValueAttributeObject(568, "LocusValue_rs1042602")
addLocusValueAttributeObject(569, "LocusValue_rs1887276")
addLocusValueAttributeObject(570, "LocusValue_rs147068120")
addLocusValueAttributeObject(571, "LocusValue_rs9971729")
addLocusValueAttributeObject(572, "LocusValue_rs4911442")
addLocusValueAttributeObject(573, "LocusValue_rs6910861")
addLocusValueAttributeObject(574, "LocusValue_rs12543326")
addLocusValueAttributeObject(575, "LocusValue_rs10424065")
addLocusValueAttributeObject(576, "LocusValue_rs1978859")
addLocusValueAttributeObject(577, "LocusValue_rs6462562")
addLocusValueAttributeObject(578, "LocusValue_rs6020957")
addLocusValueAttributeObject(579, "LocusValue_rs2733832")
addLocusValueAttributeObject(580, "LocusValue_rs8039195")
addLocusValueAttributeObject(581, "LocusValue_rs2034128")
addLocusValueAttributeObject(582, "LocusValue_rs4353811")
addLocusValueAttributeObject(583, "LocusValue_rs7965082")
addLocusValueAttributeObject(584, "LocusValue_rs10265937")
addLocusValueAttributeObject(585, "LocusValue_rs12437560")
addLocusValueAttributeObject(586, "LocusValue_rs1019212")
addLocusValueAttributeObject(587, "LocusValue_rs805693")
addLocusValueAttributeObject(588, "LocusValue_rs6828137")
addLocusValueAttributeObject(589, "LocusValue_rs805694")
addLocusValueAttributeObject(590, "LocusValue_rs397723")
addLocusValueAttributeObject(591, "LocusValue_rs62330021")
addLocusValueAttributeObject(592, "LocusValue_rs1572037")
addLocusValueAttributeObject(593, "LocusValue_rs7219915")
addLocusValueAttributeObject(594, "LocusValue_rs112747614")
addLocusValueAttributeObject(595, "LocusValue_rs10237838")
addLocusValueAttributeObject(596, "LocusValue_rs138777265")
addLocusValueAttributeObject(597, "LocusValue_rs6918152")
addLocusValueAttributeObject(598, "LocusValue_rs3212369")
addLocusValueAttributeObject(599, "LocusValue_rs1005999")
addLocusValueAttributeObject(600, "LocusValue_rs1393350")
addLocusValueAttributeObject(601, "LocusValue_rs7176696")
addLocusValueAttributeObject(602, "LocusValue_rs4778241")
addLocusValueAttributeObject(603, "LocusValue_rs3940272")
addLocusValueAttributeObject(604, "LocusValue_rs2835621")
addLocusValueAttributeObject(605, "LocusValue_rs2034127")
addLocusValueAttributeObject(606, "LocusValue_rs9858909")
addLocusValueAttributeObject(607, "LocusValue_rs6020940")
addLocusValueAttributeObject(608, "LocusValue_rs2168809")
addLocusValueAttributeObject(609, "LocusValue_rs4433629")
addLocusValueAttributeObject(610, "LocusValue_rs16977002")
addLocusValueAttributeObject(611, "LocusValue_rs10843104")
addLocusValueAttributeObject(612, "LocusValue_rs3794604")
addLocusValueAttributeObject(613, "LocusValue_rs2854746")
addLocusValueAttributeObject(614, "LocusValue_rs10237488")
addLocusValueAttributeObject(615, "LocusValue_rs9971100")
addLocusValueAttributeObject(616, "LocusValue_rs2095645")
addLocusValueAttributeObject(617, "LocusValue_rs2385028")
addLocusValueAttributeObject(618, "LocusValue_rs6997494")
addLocusValueAttributeObject(619, "LocusValue_rs2422241")
addLocusValueAttributeObject(620, "LocusValue_rs6039272")
addLocusValueAttributeObject(621, "LocusValue_rs1105879")
addLocusValueAttributeObject(622, "LocusValue_rs4911414")
addLocusValueAttributeObject(623, "LocusValue_rs72928978")
addLocusValueAttributeObject(624, "LocusValue_rs73488486")
addLocusValueAttributeObject(625, "LocusValue_rs141318671")
addLocusValueAttributeObject(626, "LocusValue_rs4778211")
addLocusValueAttributeObject(627, "LocusValue_rs10237319")
addLocusValueAttributeObject(628, "LocusValue_rs4793389")
addLocusValueAttributeObject(629, "LocusValue_rs7183877")
addLocusValueAttributeObject(630, "LocusValue_rs12552712")
addLocusValueAttributeObject(631, "LocusValue_rs7628370")
addLocusValueAttributeObject(632, "LocusValue_rs1562005")
addLocusValueAttributeObject(633, "LocusValue_rs1015092")
addLocusValueAttributeObject(634, "LocusValue_rs7214306")
addLocusValueAttributeObject(635, "LocusValue_rs6056126")
addLocusValueAttributeObject(636, "LocusValue_rs11957757")
addLocusValueAttributeObject(637, "LocusValue_rs805722")
addLocusValueAttributeObject(638, "LocusValue_rs7277820")
addLocusValueAttributeObject(639, "LocusValue_rs12821256")
addLocusValueAttributeObject(640, "LocusValue_rs7552331")
addLocusValueAttributeObject(641, "LocusValue_rs17447439")
addLocusValueAttributeObject(642, "LocusValue_rs3935591")
addLocusValueAttributeObject(643, "LocusValue_rs3768056")
addLocusValueAttributeObject(644, "LocusValue_rs12913832")
addLocusValueAttributeObject(645, "LocusValue_rs7640340")
addLocusValueAttributeObject(646, "LocusValue_rs12155314")
addLocusValueAttributeObject(647, "LocusValue_rs9782955")
addLocusValueAttributeObject(648, "LocusValue_rs351385")
addLocusValueAttributeObject(649, "LocusValue_rs4790309")
addLocusValueAttributeObject(650, "LocusValue_rs937171")
addLocusValueAttributeObject(651, "LocusValue_rs4552364")
addLocusValueAttributeObject(652, "LocusValue_rs11191909")
addLocusValueAttributeObject(653, "LocusValue_rs728405")
addLocusValueAttributeObject(654, "LocusValue_rs1325127")
addLocusValueAttributeObject(655, "LocusValue_rs72777200")
addLocusValueAttributeObject(656, "LocusValue_rs2762462")
addLocusValueAttributeObject(657, "LocusValue_rs6749293")
addLocusValueAttributeObject(658, "LocusValue_rs7807181")
addLocusValueAttributeObject(659, "LocusValue_rs7966317")
addLocusValueAttributeObject(660, "LocusValue_rs2238289")
addLocusValueAttributeObject(661, "LocusValue_rs16891982")
addLocusValueAttributeObject(662, "LocusValue_rs2748901")
addLocusValueAttributeObject(663, "LocusValue_rs4053148")
addLocusValueAttributeObject(664, "LocusValue_rs116359091")
addLocusValueAttributeObject(665, "LocusValue_rs1129038")
addLocusValueAttributeObject(666, "LocusValue_rs7516150")
addLocusValueAttributeObject(667, "LocusValue_rs4648379")
addLocusValueAttributeObject(668, "LocusValue_rs13097965")
addLocusValueAttributeObject(669, "LocusValue_rs11237982")
addLocusValueAttributeObject(670, "LocusValue_rs2252893")
addLocusValueAttributeObject(671, "LocusValue_rs12906280")
addLocusValueAttributeObject(672, "LocusValue_rs11604811")
addLocusValueAttributeObject(673, "LocusValue_rs12335410")
addLocusValueAttributeObject(674, "LocusValue_rs6555969")
addLocusValueAttributeObject(675, "LocusValue_rs6478394")
addLocusValueAttributeObject(676, "LocusValue_rs2274107")
addLocusValueAttributeObject(677, "LocusValue_rs74409360")
addLocusValueAttributeObject(678, "LocusValue_rs10278187")
addLocusValueAttributeObject(679, "LocusValue_rs4633993")
addLocusValueAttributeObject(680, "LocusValue_rs2832438")
addLocusValueAttributeObject(681, "LocusValue_rs2894450")
addLocusValueAttributeObject(682, "LocusValue_rs875143")
addLocusValueAttributeObject(683, "LocusValue_rs916977")
addLocusValueAttributeObject(684, "LocusValue_rs341147")
addLocusValueAttributeObject(685, "LocusValue_rs1999527")
addLocusValueAttributeObject(686, "LocusValue_rs10234405")
addLocusValueAttributeObject(687, "LocusValue_rs2327101")
addLocusValueAttributeObject(688, "LocusValue_rs8028689")
addLocusValueAttributeObject(689, "LocusValue_rs717463")
addLocusValueAttributeObject(690, "LocusValue_rs8079498")
addLocusValueAttributeObject(691, "LocusValue_rs12593929")
addLocusValueAttributeObject(692, "LocusValue_rs12203592")
addLocusValueAttributeObject(693, "LocusValue_rs4521336")
addLocusValueAttributeObject(694, "LocusValue_rs1834640")
addLocusValueAttributeObject(695, "LocusValue_rs13098099")
addLocusValueAttributeObject(696, "LocusValue_rs975633")
addLocusValueAttributeObject(697, "LocusValue_rs13297008")
addLocusValueAttributeObject(698, "LocusValue_rs2240203")
addLocusValueAttributeObject(699, "LocusValue_rs3829241")
addLocusValueAttributeObject(700, "LocusValue_rs12694574")
addLocusValueAttributeObject(701, "LocusValue_rs2034129")
addLocusValueAttributeObject(702, "LocusValue_rs1800407")
addLocusValueAttributeObject(703, "LocusValue_rs348613")
addLocusValueAttributeObject(704, "LocusValue_rs7182710")
addLocusValueAttributeObject(705, "LocusValue_rs142317543")
addLocusValueAttributeObject(706, "LocusValue_rs7781059")
addLocusValueAttributeObject(707, "LocusValue_rs4778138")
addLocusValueAttributeObject(708, "LocusValue_rs1126809")
addLocusValueAttributeObject(709, "LocusValue_rs1408799")
addLocusValueAttributeObject(710, "LocusValue_rs1562006")
addLocusValueAttributeObject(711, "LocusValue_rs12452184")
addLocusValueAttributeObject(712, "LocusValue_rs10209564")
addLocusValueAttributeObject(713, "LocusValue_rs12913823")
addLocusValueAttributeObject(714, "LocusValue_rs11631797")
addLocusValueAttributeObject(715, "LocusValue_rs6944702")
addLocusValueAttributeObject(716, "LocusValue_rs6693258")
addLocusValueAttributeObject(717, "LocusValue_rs642742")
addLocusValueAttributeObject(718, "LocusValue_rs6795519")
addLocusValueAttributeObject(719, "LocusValue_rs6039266")
addLocusValueAttributeObject(720, "LocusValue_rs2070959")
addLocusValueAttributeObject(721, "LocusValue_rs6420484")
addLocusValueAttributeObject(722, "LocusValue_rs2835660")
addLocusValueAttributeObject(723, "LocusValue_rs12358982")
addLocusValueAttributeObject(724, "LocusValue_rs16977008")
addLocusValueAttributeObject(725, "LocusValue_rs1667394")
addLocusValueAttributeObject(726, "LocusValue_rs1426654")
addLocusValueAttributeObject(727, "LocusValue_rs1939697")
addLocusValueAttributeObject(728, "LocusValue_rs7170852")
addLocusValueAttributeObject(729, "LocusValue_rs121908120")
addLocusValueAttributeObject(730, "LocusValue_rs2327089")
addLocusValueAttributeObject(731, "LocusValue_rs911020")
addLocusValueAttributeObject(732, "LocusValue_rs6058017")
addLocusValueAttributeObject(733, "LocusValue_rs6462544")
addLocusValueAttributeObject(734, "LocusValue_rs2108166")
addLocusValueAttributeObject(735, "LocusValue_rs17252053")
addLocusValueAttributeObject(736, "LocusValue_rs9301973")
addLocusValueAttributeObject(737, "LocusValue_rs35264875")
addLocusValueAttributeObject(738, "LocusValue_rs9894429")
addLocusValueAttributeObject(739, "LocusValue_rs10485860")
addLocusValueAttributeObject(740, "LocusValue_rs1008591")
addLocusValueAttributeObject(741, "LocusValue_rs6056119")
addLocusValueAttributeObject(742, "LocusValue_rs3912104")
addLocusValueAttributeObject(743, "LocusValue_rs790464")
addLocusValueAttributeObject(744, "LocusValue_rs4778218")
addLocusValueAttributeObject(745, "LocusValue_rs1747677")
addLocusValueAttributeObject(746, "LocusValue_rs6056066")
addLocusValueAttributeObject(747, "LocusValue_rs12614022")
addLocusValueAttributeObject(748, "LocusValue_rs7799331")
addLocusValueAttributeObject(749, "LocusValue_rs1805007")
addLocusValueAttributeObject(750, "LocusValue_rs4648477")
addLocusValueAttributeObject(751, "LocusValue_rs4648478")
addLocusValueAttributeObject(752, "LocusValue_rs9692219")
getMandatoryAttributeFunction := func(profileVersion int)([]string, error){
if (profileVersion != 1){
return nil, errors.New("Trying to retrieve mandatory attributes for unknown profile version.")
}
mandatoryAttributesList := []string{mandatoryAttributeName}
return mandatoryAttributesList, nil
}
attributeObject_LocusIsPhased := AttributeObject{
ProfileVersions: []int{1},
AttributeIdentifier: attributeIdentifier,
AttributeName: attributeName,
GetIsRequired: getIsRequired_No,
GetMandatoryAttributes: getMandatoryAttributeFunction,
GetProfileTypes: getProfileTypes_Mate,
GetIsCanonical: getIsCanonical_Always,
CheckValueFunction: checkValueFunction_MateYesOrNo,
}
attributeObjectsList = append(attributeObjectsList, attributeObject_LocusIsPhased)
}
index := 10000
for _, rsID := range shareableRSIDsList{
rsIDString := helpers.ConvertInt64ToString(rsID)
locusValueAttributeName := "LocusValue_rs" + rsIDString
locusIsPhasedAttributeName := "LocusIsPhased_rs" + rsIDString
addLocusValueAttributeObject(index, locusValueAttributeName, locusIsPhasedAttributeName)
index += 1
addLocusIsPhasedAttributeObject(index, locusIsPhasedAttributeName, locusValueAttributeName)
index += 1
}
profileAttributeObjectsList = attributeObjectsList
return nil
}


@ -10,9 +10,20 @@ import "seekia/internal/helpers"
import "testing"
import "strings"
func TestProfileFormat(t *testing.T){
err := profileFormat.InitializeProfileFormatVariables()
err := polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
err = profileFormat.InitializeProfileFormatVariables()
if (err != nil){
t.Fatalf("Failed to initialize profile format variables: " + err.Error())
}
@ -192,7 +203,10 @@ func TestProfileGeneticReferences(t *testing.T){
}
}
polygenicDiseases.InitializePolygenicDiseaseVariables()
err = polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil) {
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
polygenicDiseaseObjectsList, err := polygenicDiseases.GetPolygenicDiseaseObjectsList()
if (err != nil) {
@ -203,9 +217,7 @@ func TestProfileGeneticReferences(t *testing.T){
diseaseLociList := diseaseObject.LociList
for _, locusObject := range diseaseLociList{
locusRSID := locusObject.LocusRSID
for _, locusRSID := range diseaseLociList{
locusRSIDString := helpers.ConvertInt64ToString(locusRSID)
@ -218,7 +230,10 @@ func TestProfileGeneticReferences(t *testing.T){
}
}
traits.InitializeTraitVariables()
err = traits.InitializeTraitVariables()
if (err != nil) {
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
traitObjectsList, err := traits.GetTraitObjectsList()
if (err != nil){


@ -1,58 +0,0 @@
// geneticPredictionModels contains neural network models for predicting genetic traits
// These are .gob encoded files of []float32 weights
// This package also contains prediction accuracy information for each model, describing how accurate each model's predictions are
// All of the files in this package are created by the Create Genetic Models utility.
// This utility is located in /utilities/createGeneticModels/createGeneticModels.go
package geneticPredictionModels
import _ "embed"
import "errors"
//Outputs:
// -bool: Model exists
// -[]byte: Model file bytes
func GetGeneticPredictionModelBytes(traitName string)(bool, []byte){
switch traitName{
case "Eye Color":{
return true, predictionModel_EyeColor
}
case "Lactose Tolerance":{
return true, predictionModel_LactoseTolerance
}
}
return false, nil
}
//go:embed predictionModels/EyeColorModel.gob
var predictionModel_EyeColor []byte
//go:embed predictionModels/LactoseToleranceModel.gob
var predictionModel_LactoseTolerance []byte
// The files returned by this function are .gob encoded geneticPrediction.TraitPredictionAccuracyInfoMap objects
func GetPredictionModelTraitAccuracyInfoBytes(traitName string)([]byte, error){
switch traitName{
case "Eye Color":{
return predictionAccuracy_EyeColor, nil
}
case "Lactose Tolerance":{
return predictionAccuracy_LactoseTolerance, nil
}
}
return nil, errors.New("GetPredictionModelTraitAccuracyInfoBytes called with unknown traitName: " + traitName)
}
//go:embed predictionModelAccuracies/EyeColorModelAccuracy.gob
var predictionAccuracy_EyeColor []byte
//go:embed predictionModelAccuracies/LactoseToleranceModelAccuracy.gob
var predictionAccuracy_LactoseTolerance []byte


@ -1,48 +0,0 @@
package geneticPredictionModels_test
import "seekia/resources/geneticPredictionModels"
import "testing"
import "seekia/internal/genetics/geneticPrediction"
func TestGeneticPredictionModels(t *testing.T){
traitNamesList := []string{"Eye Color", "Lactose Tolerance"}
for _, traitName := range traitNamesList{
modelFound, modelBytes := geneticPredictionModels.GetGeneticPredictionModelBytes(traitName)
if (modelFound == false){
t.Fatalf("GetGeneticPredictionModelBytes failed to find model for trait: " + traitName)
}
_, err := geneticPrediction.DecodeBytesToNeuralNetworkObject(modelBytes)
if (err != nil){
t.Fatalf("DecodeBytesToNeuralNetworkObject failed: " + err.Error())
}
}
}
func TestGeneticPredictionModelAccuracies(t *testing.T){
traitNamesList := []string{"Eye Color", "Lactose Tolerance"}
for _, traitName := range traitNamesList{
accuracyInfoBytes, err := geneticPredictionModels.GetPredictionModelTraitAccuracyInfoBytes(traitName)
if (err != nil){
t.Fatalf("GetPredictionModelTraitAccuracyInfoBytes failed: " + err.Error())
}
_, err = geneticPrediction.DecodeBytesToTraitPredictionAccuracyInfoMap(accuracyInfoBytes)
if (err != nil){
t.Fatalf("DecodeBytesToTraitPredictionAccuracyInfoMap failed: " + err.Error())
}
}
}


@ -0,0 +1,8 @@
// attributeLoci provides loci associated with various bodily attributes
// For example, this package stores all loci associated with the brain
// We can then use this same set of loci to predict all attributes pertaining to the brain, such as autism, depression, anxiety, etc.
// We need this package because the polygenicDiseases and traits packages cannot import each other, as Go forbids circular imports
package attributeLoci


@ -0,0 +1,605 @@
package attributeLoci
import "maps"
// Outputs:
// -map[int64]map[string]string
// -Map Structure: rsID -> map[ReferenceName]ReferenceLink
func GetAutismLoci()map[int64]map[string]string{
// Map Structure: rsID -> (map[Reference Name]Reference Link)
locusReferencesMap := make(map[int64]map[string]string)
locus1_ReferencesMap := make(map[string]string)
locus1_ReferencesMap["SNPedia.com - rs10513025"] = "https://www.snpedia.com/index.php/Rs10513025"
locusReferencesMap[10513025] = locus1_ReferencesMap
locus2_ReferencesMap := make(map[string]string)
locus2_ReferencesMap["SNPedia.com - rs2710102"] = "https://www.snpedia.com/index.php/Rs2710102"
locusReferencesMap[2710102] = locus2_ReferencesMap
locus3_ReferencesMap := make(map[string]string)
locus3_ReferencesMap["SNPedia.com - rs7794745"] = "https://www.snpedia.com/index.php/Rs7794745"
locusReferencesMap[7794745] = locus3_ReferencesMap
locus4_ReferencesMap := make(map[string]string)
locus4_ReferencesMap["SNPedia.com - rs1858830"] = "https://www.snpedia.com/index.php/Rs1858830"
locusReferencesMap[1858830] = locus4_ReferencesMap
locus5_ReferencesMap := make(map[string]string)
locus5_ReferencesMap["SNPedia.com - rs1322784"] = "https://www.snpedia.com/index.php/Rs1322784"
locusReferencesMap[1322784] = locus5_ReferencesMap
locus6_ReferencesMap := make(map[string]string)
locus6_ReferencesMap["SNPedia.com - rs1804197"] = "https://www.snpedia.com/index.php/Rs1804197"
locusReferencesMap[1804197] = locus6_ReferencesMap
locus7_ReferencesMap := make(map[string]string)
locus7_ReferencesMap["SNPedia.com - rs265981"] = "https://www.snpedia.com/index.php/Rs265981"
locusReferencesMap[265981] = locus7_ReferencesMap
locus8_ReferencesMap := make(map[string]string)
locus8_ReferencesMap["SNPedia.com - rs4532"] = "https://www.snpedia.com/index.php/Rs4532"
locusReferencesMap[4532] = locus8_ReferencesMap
locus9_ReferencesMap := make(map[string]string)
locus9_ReferencesMap["SNPedia.com - rs686"] = "https://www.snpedia.com/index.php/Rs686"
locusReferencesMap[686] = locus9_ReferencesMap
locus10_ReferencesMap := make(map[string]string)
locus10_ReferencesMap["SNPedia.com - rs6766410"] = "https://www.snpedia.com/index.php/Rs6766410"
locusReferencesMap[6766410] = locus10_ReferencesMap
locus11_ReferencesMap := make(map[string]string)
locus11_ReferencesMap["SNPedia.com - rs6807362"] = "https://www.snpedia.com/index.php/Rs6807362"
locusReferencesMap[6807362] = locus11_ReferencesMap
locus12_ReferencesMap := make(map[string]string)
locus12_ReferencesMap["SNPedia.com - rs1143674"] = "https://www.snpedia.com/index.php/Rs1143674"
locusReferencesMap[1143674] = locus12_ReferencesMap
locus13_ReferencesMap := make(map[string]string)
locus13_ReferencesMap["SNPedia.com - rs2745557"] = "https://www.snpedia.com/index.php/Rs2745557"
locusReferencesMap[2745557] = locus13_ReferencesMap
locus14_ReferencesMap := make(map[string]string)
locus14_ReferencesMap["SNPedia.com - rs2217262"] = "https://www.snpedia.com/index.php/Rs2217262"
locusReferencesMap[2217262] = locus14_ReferencesMap
locus15_ReferencesMap := make(map[string]string)
locus15_ReferencesMap["SNPedia.com - rs373126732"] = "https://www.snpedia.com/index.php/Rs373126732"
locusReferencesMap[373126732] = locus15_ReferencesMap
locus16_ReferencesMap := make(map[string]string)
locus16_ReferencesMap["SNPedia.com - rs184718561"] = "https://www.snpedia.com/index.php/Rs184718561"
locusReferencesMap[184718561] = locus16_ReferencesMap
locus17_ReferencesMap := make(map[string]string)
locus17_ReferencesMap["SNPedia.com - rs1445442"] = "https://www.snpedia.com/index.php/Rs1445442"
locusReferencesMap[1445442] = locus17_ReferencesMap
locus18_ReferencesMap := make(map[string]string)
locus18_ReferencesMap["SNPedia.com - rs2421826"] = "https://www.snpedia.com/index.php/Rs2421826"
locusReferencesMap[2421826] = locus18_ReferencesMap
locus19_ReferencesMap := make(map[string]string)
locus19_ReferencesMap["SNPedia.com - rs1358054"] = "https://www.snpedia.com/index.php/Rs1358054"
locusReferencesMap[1358054] = locus19_ReferencesMap
locus20_ReferencesMap := make(map[string]string)
locus20_ReferencesMap["SNPedia.com - rs722628"] = "https://www.snpedia.com/index.php/Rs722628"
locusReferencesMap[722628] = locus20_ReferencesMap
locus21_ReferencesMap := make(map[string]string)
locus21_ReferencesMap["SNPedia.com - rs536861"] = "https://www.snpedia.com/index.php/Rs536861"
locusReferencesMap[536861] = locus21_ReferencesMap
locus22_ReferencesMap := make(map[string]string)
locus22_ReferencesMap["SNPedia.com - rs757972971"] = "https://www.snpedia.com/index.php/Rs757972971"
locusReferencesMap[757972971] = locus22_ReferencesMap
referencesMap_LocusList1 := make(map[string]string)
referencesMap_LocusList1["Understanding the impact of SNPs associated with autism spectrum disorder on biological pathways in the human fetal and adult cortex"] = "https://www.nature.com/articles/s41598-021-95447-z"
lociList1 := []int64{
13217619,
115329265,
116137698,
141342723,
75782365,
151267808,
7746199,
114115252,
4298967,
1782810,
6921919,
9467711,
115707823,
116633139,
115123779,
116326873,
9834970,
144762289,
9348739,
4481150,
12129573,
116408368,
11191419,
115242751,
116385615,
114882497,
114867672,
12658451,
202906,
13212562,
7085104,
1702294,
114276265,
116427960,
59574136,
114041423,
7531118,
114964506,
111639056,
6939532,
6940116,
116663187,
114904464,
145547914,
9269271,
114963521,
140502984,
61867293,
115035678,
9274390,
11688767,
78110044,
150680405,
10883832,
7752195,
115497191,
116676919,
11191582,
115344853,
144911693,
71395455,
5758265,
2007044,
149979052,
115682897,
3001723,
1024582,
115625073,
9273177,
61472021,
12668848,
184153866,
115558405,
150430679,
115687605,
35324223,
9274299,
138984909,
145076523,
55661361,
911186,
144304366,
10149470,
144660248,
13218591,
114455101,
185717927,
144649399,
114086406,
11682175,
142972412,
138748649,
7405404,
11693528,
12958048,
35225200,
114950038,
140865314,
4129585,
12887734,
36057735,
115052633,
186129480,
2507989,
2021722,
140505938,
2388334,
3617,
114274203,
281768,
115937317,
144018888,
2535629,
4906364,
180778602,
707939,
8084351,
80318442,
186229361,
9461856,
113397282,
28681284,
113205291,
2851447,
4380187,
115960997,
1793889,
142790902,
111312615,
144532965,
75968099,
115661163,
1518367,
193267147,
41293179,
200986,
34787248,
140364877,
13240464,
1625579,
4702,
2514218,
778353,
325506,
182908437,
149721896,
6434928,
4713071,
11753207,
191843781,
116182620,
2760981,
116067082,
142601889,
147976543,
116254153,
8054556,
114204022,
115165987,
9636107,
41563,
35828350,
764284,
115325719,
7193263,
149915948,
17843707,
79879286,
631399,
732381,
1150688,
189600472,
3798869,
5757717,
145501595,
4642619,
117616320,
12704290,
2176546,
149787317,
11570190,
4391122,
7071123,
12712388,
4307059,
369637,
114291394,
11740474,
12925872,
116460775,
114838832,
10791097,
35610290,
114812317,
9469174,
7801375,
114508985,
6704768,
4580973,
147875011,
7893279,
12966547,
9922678,
111294930,
6047287,
34215985,
2693698,
12826178,
2237234,
11210892,
67756423,
9787523,
10108980,
2057884,
1498232,
8042374,
142520578,
114771361,
114810457,
17194490,
145470632,
36063234,
2332700,
1615350,
3735025,
115283957,
75059851,
1730054,
116593970,
4523957,
169738,
35346733,
12954356,
7907645,
2910032,
9270074,
1899546,
6071524,
11874716,
72761442,
3132556,
116139966,
139547629,
28724212,
6855246,
72934570,
147793969,
115487448,
4619651,
7521492,
2103655,
880090,
1806153,
11787216,
115915654,
11223651,
62378245,
8009147,
7191183,
77502336,
3849046,
1131275,
61747867,
116047537,
41293330,
61789073,
7914558,
10043984,
10514301,
117956829,
4647903,
4916723,
28669119,
35774874,
4244354,
1452075,
56223946,
2434529,
115641444,
149998036,
184123737,
10994359,
9360557,
80256351,
6125656,
247910,
3812984,
915057,
17659437,
11641947,
139099016,
72687362,
57709857,
11210195,
3020736,
12592967,
5995756,
385492,
115443066,
9371601,
59979824,
6694545,
1484144,
832190,
9267057,
4309187,
149544854,
116502302,
191269336,
1006737,
10265001,
6969410,
1080500,
171748,
139480376,
17292804,
174592,
1620977,
184538485,
191239160,
301798,
10211550,
10994397,
9677504,
144158419,
2098651,
8321,
11231640,
77135925,
12474906,
2300861,
2391769,
10520163,
9607782,
55648125,
10099100,
16854048,
35131895,
1977199,
145607970,
115569272,
116552815,
6803008,
35998080,
10791111,
2944591,
1353545,
115437294,
133047,
9274657,
11191580,
11191454,
7618871,
10745841,
61882743,
116755193,
142462188,
7200826,
27419,
2414718,
2842198,
12552,
395138,
760648,
1002656,
2898883,
13072940,
12443170,
114441450,
146201420,
184981897,
138850297,
8032315,
7184114,
115136442,
2767713,
2828478,
9879311,
114142645,
111977918,
7819570,
12522290,
112209031,
10491964,
11658257,
62526783,
6471814,
11866581,
12894153,
2391734,
2522831,
2003490,
301799,
1226412,
1950829,
8453,
926938,
6537825,
111931861,
115963308,
149961934,
61847307,
146827975,
1339227,
36350,
7432375,
9656169,
28758902,
427691,
2293751,
182087722,
73416724,
61884307,
188190243,
41294271,
114830752,
7004633,
7785663,
8066384,
188099135,
4730387,
11887562,
2801578,
4242470,
746839,
3827735,
11582563,
11102807,
7511633,
11102800,
11585926,
6661053,
11589568,
4141463,
201910565,
71190156,
353547,
880446,
2115780,
114277634,
140849564,
76994193,
114875775,
7122181,
221902,
12576775,
10503253,
2799573,
4495234,
4526442,
4682973,
12898460,
2047568,
2910032,
1501361,
}
for _, rsID := range lociList1{
existingMap, exists := locusReferencesMap[rsID]
if (exists == false){
locusReferencesMap[rsID] = maps.Clone(referencesMap_LocusList1)
} else {
// We merge the maps
for key, value := range referencesMap_LocusList1{
existingMap[key] = value
}
locusReferencesMap[rsID] = existingMap
}
}
return locusReferencesMap
}


@ -185,7 +185,10 @@ func TestGeneticReferences(t *testing.T){
}
}
polygenicDiseases.InitializePolygenicDiseaseVariables()
err = polygenicDiseases.InitializePolygenicDiseaseVariables()
if (err != nil){
t.Fatalf("InitializePolygenicDiseaseVariables failed: " + err.Error())
}
polygenicDiseaseObjectsList, err := polygenicDiseases.GetPolygenicDiseaseObjectsList()
if (err != nil) {
@ -199,6 +202,7 @@ func TestGeneticReferences(t *testing.T){
diseaseName := diseaseObject.DiseaseName
diseaseDescription := diseaseObject.DiseaseDescription
diseaseEffectedSex := diseaseObject.EffectedSex
diseaseLocusReferencesMap := diseaseObject.LocusReferencesMap
diseaseLociList := diseaseObject.LociList
diseaseReferencesMap := diseaseObject.References
@ -218,85 +222,40 @@ func TestGeneticReferences(t *testing.T){
t.Fatalf("PolygenicDisease effected sex is invalid: " + diseaseEffectedSex)
}
for rsID, referencesMap := range diseaseLocusReferencesMap{
containsItem := slices.Contains(diseaseLociList, rsID)
if (containsItem == false){
t.Fatalf("Polygenic disease diseaseLocusReferencesMap contains a disease locus that is not in the disease's loci list.")
}
allRSIDsMap[rsID] = struct{}{}
referencesAreValid := verifyReferencesMap(referencesMap)
if (referencesAreValid == false){
t.Fatalf("PolygenicDisease references map is invalid for disease locus.")
}
}
containsDuplicates, _ := helpers.CheckIfListContainsDuplicates(diseaseLociList)
if (containsDuplicates == true){
t.Fatalf("Polygenic disease object contains diseaseLociList with duplicate rsIDs.")
}
if (len(diseaseLocusReferencesMap) > len(diseaseLociList)){
t.Fatalf("Polygenic disease contains locus references map that is longer than the diseaseLociList")
}
referencesAreValid := verifyReferencesMap(diseaseReferencesMap)
if (referencesAreValid == false){
t.Fatalf("PolygenicDisease references map is invalid for disease: " + diseaseName)
}
// We use this map to make sure each disease locus references a unique rsid
allPolygenicDiseaseRSIDsMap := make(map[int64]struct{})
for _, locusObject := range diseaseLociList{
locusIdentifier := locusObject.LocusIdentifier
locusRSID := locusObject.LocusRSID
riskWeightsMap := locusObject.RiskWeightsMap
oddsRatiosMap := locusObject.OddsRatiosMap
minimumWeight := locusObject.MinimumRiskWeight
maximumWeight := locusObject.MaximumRiskWeight
allRSIDsMap[locusRSID] = struct{}{}
identifierIsValid := verifyIdentifier(locusIdentifier)
if (identifierIsValid == false){
t.Fatalf(diseaseName + " Invalid locus identifier found: " + locusIdentifier)
}
_, exists := allIdentifiersMap[locusIdentifier]
if (exists == true){
t.Fatalf(diseaseName + " Duplicate locus identifier found: " + locusIdentifier)
}
allIdentifiersMap[locusIdentifier] = struct{}{}
_, exists = allPolygenicDiseaseRSIDsMap[locusRSID]
if (exists == true){
rsidString := helpers.ConvertInt64ToString(locusRSID)
t.Fatalf(diseaseName + " RSID Collision found: " + rsidString)
}
allPolygenicDiseaseRSIDsMap[locusRSID] = struct{}{}
if (len(riskWeightsMap) == 0){
t.Fatalf("Empty base weights map found: " + locusIdentifier)
}
trueMinimumWeight := 100000
trueMaximumWeight := -100000
for basePair, basePairWeight := range riskWeightsMap{
isValid := verifyBasePair(basePair)
if (isValid == false){
t.Fatalf("Base pair weights map contains invalid base pair: " + locusIdentifier)
}
if (basePairWeight < trueMinimumWeight){
trueMinimumWeight = basePairWeight
}
if (basePairWeight > trueMaximumWeight){
trueMaximumWeight = basePairWeight
}
}
if (trueMinimumWeight != minimumWeight){
t.Fatalf(diseaseName + ": Invalid minimum base pair weight found: " + locusIdentifier)
}
if (trueMaximumWeight != maximumWeight){
t.Fatalf(diseaseName + ": Invalid maximum base pair weight found: " + locusIdentifier)
}
for basePair, _ := range oddsRatiosMap{
isValid := verifyBasePair(basePair)
if (isValid == false){
t.Fatalf("Odds ratio weights map contains invalid base pair: " + locusIdentifier)
}
}
//TODO: Make sure that duplicate base pairs have same weight, odds ratios and probabilities
}
}
traits.InitializeTraitVariables()
err = traits.InitializeTraitVariables()
if (err != nil){
t.Fatalf("InitializeTraitVariables failed: " + err.Error())
}
traitObjectsList, err := traits.GetTraitObjectsList()
if (err != nil){
@@ -499,12 +458,10 @@ func TestGeneticReferences(t *testing.T){
//
// We only care about alias collisions within each company.
// Multiple companies can refer to the same location with the same alias.
//
type companyAliasStruct struct{
geneticsCompany locusMetadata.GeneticsCompany
locusAlias string
}
@@ -519,6 +476,8 @@ func TestGeneticReferences(t *testing.T){
rsidsList := locusMetadataObject.RSIDsList
locusChromosome := locusMetadataObject.Chromosome
locusPosition := locusMetadataObject.Position
geneInfoIsKnown := locusMetadataObject.GeneInfoIsKnown
geneExists := locusMetadataObject.GeneExists
geneNamesList := locusMetadataObject.GeneNamesList
locusCompanyAliasesMap := locusMetadataObject.CompanyAliases
referencesMap := locusMetadataObject.References
@@ -541,8 +500,8 @@ func TestGeneticReferences(t *testing.T){
_, exists := locusMetadataRSIDsMap[rsID]
if (exists == true){
RSIDString := helpers.ConvertInt64ToString(rsID)
t.Fatalf("locusMetadataObjectsList contains duplicate RSID: " + RSIDString)
rsidString := helpers.ConvertInt64ToString(rsID)
t.Fatalf("locusMetadataObjectsList contains duplicate RSID: " + rsidString)
}
locusMetadataRSIDsMap[rsID] = struct{}{}
@@ -580,7 +539,12 @@ func TestGeneticReferences(t *testing.T){
locusPositionsMap[locusPositionObject] = struct{}{}
if (len(geneNamesList) != 0){
if (geneInfoIsKnown == true && geneExists == true){
if (len(geneNamesList) == 0){
t.Fatalf("locusMetadataObjectsList contains locus with known gene and empty geneNamesList.")
}
for _, geneName := range geneNamesList{
if (geneName == ""){
t.Fatalf("locusMetadataObjectsList contains locus with empty geneName in geneNamesList.")
@@ -593,7 +557,6 @@ func TestGeneticReferences(t *testing.T){
for _, locusCompanyAlias := range companyAliasesList{
companyAliasObject := companyAliasStruct{
geneticsCompany: companyObject,
locusAlias: locusCompanyAlias,
}
@@ -613,6 +576,8 @@ func TestGeneticReferences(t *testing.T){
}
}
//TODO: Check to make sure that there are no identical company aliases for different loci
missingLociList := make([]int64, 0)
for rsID, _ := range allRSIDsMap{


@@ -1,182 +0,0 @@
[
{
"RSIDsList": [
17646946
],
"Chromosome": 1,
"Position": 152090291,
"GeneNamesList": [
"TCHHL1"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs17646946": "https://www.snpedia.com/index.php/Rs17646946"
}
},
{
"RSIDsList": [
11803731
],
"Chromosome": 1,
"Position": 152110849,
"GeneNamesList": [
"TCHH"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs11803731": "https://www.snpedia.com/index.php/Rs11803731"
}
},
{
"RSIDsList": [
4648379
],
"Chromosome": 1,
"Position": 3261516,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - Appearance": "https://www.snpedia.com/index.php/Appearance"
}
},
{
"RSIDsList": [
1999527
],
"Chromosome": 1,
"Position": 3256108,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7516150
],
"Chromosome": 1,
"Position": 3253889,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7552331
],
"Chromosome": 1,
"Position": 3253941,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
9782955
],
"Chromosome": 1,
"Position": 236039877,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
3768056
],
"Chromosome": 1,
"Position": 235907825,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
351385
],
"Chromosome": 1,
"Position": 212421629,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1572037
],
"Chromosome": 1,
"Position": 3254369,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6693258,
56426910
],
"Chromosome": 1,
"Position": 9106285,
"GeneNamesList": [
"GPR157"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4648477
],
"Chromosome": 1,
"Position": 3335411,
"GeneNamesList": [
"PRDM16"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4648478,
56579652,
58636362
],
"Chromosome": 1,
"Position": 3335443,
"GeneNamesList": [
"PRDM16"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2385028,
35558782,
4660119,
4428879
],
"Chromosome": 1,
"Position": 235872505,
"GeneNamesList": [
"LYST"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,97 +0,0 @@
[
{
"RSIDsList": [
2274107
],
"Chromosome": 10,
"Position": 105838703,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1747677
],
"Chromosome": 10,
"Position": 105815241,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
805722
],
"Chromosome": 10,
"Position": 105810400,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
805693
],
"Chromosome": 10,
"Position": 105815324,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12358982
],
"Chromosome": 10,
"Position": 104094571,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
805694
],
"Chromosome": 10,
"Position": 104055696,
"GeneNamesList": [
"COL17A1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
11191909
],
"Chromosome": 10,
"Position": 104053243,
"GeneNamesList": [
"COL17A1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
9971100,
10883964
],
"Chromosome": 10,
"Position": 104066661,
"GeneNamesList": [
"COL17A1"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,280 +0,0 @@
[
{
"RSIDsList": [
4987945,
2227924
],
"Chromosome": 11,
"Position": 108251865,
"GeneNamesList": [
"ATM"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs4987945": "https://www.snpedia.com/index.php/Rs4987945"
}
},
{
"RSIDsList": [
3218695
],
"Chromosome": 11,
"Position": 108259051,
"GeneNamesList": [
"ATM"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs3218695": "https://www.snpedia.com/index.php/Rs3218695"
}
},
{
"RSIDsList": [
3218707
],
"Chromosome": 11,
"Position": 108244000,
"GeneNamesList": [
"ATM"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs3218707": "https://www.snpedia.com/index.php/Rs3218707"
}
},
{
"RSIDsList": [
334,
77121243
],
"Chromosome": 11,
"Position": 5227002,
"GeneNamesList": [
"HBB"
],
"CompanyAliases": {
"1": [
"i3003137"
]
},
"References": {
"SNPedia.com - rs334": "https://www.snpedia.com/index.php/Rs334"
}
},
{
"RSIDsList": [
1801673
],
"Chromosome": 11,
"Position": 108304736,
"GeneNamesList": [
"ATM"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs1801673": "https://www.snpedia.com/index.php/Rs1801673"
}
},
{
"RSIDsList": [
1800056
],
"Chromosome": 11,
"Position": 108267276,
"GeneNamesList": [
"ATM"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs1800056": "https://www.snpedia.com/index.php/Rs1800056"
}
},
{
"RSIDsList": [
1800057
],
"Chromosome": 11,
"Position": 108272729,
"GeneNamesList": [
"ATM"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs1800057": "https://www.snpedia.com/index.php/Rs1800057"
}
},
{
"RSIDsList": [
4986761
],
"Chromosome": 11,
"Position": 108254034,
"GeneNamesList": [
"ATM"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs4986761": "https://www.snpedia.com/index.php/Rs4986761"
}
},
{
"RSIDsList": [
3092856
],
"Chromosome": 11,
"Position": 108289005,
"GeneNamesList": [
"ATM"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs3092856": "https://www.snpedia.com/index.php/Rs3092856"
}
},
{
"RSIDsList": [
1800058
],
"Chromosome": 11,
"Position": 108289623,
"GeneNamesList": [
"ATM"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs1800058": "https://www.snpedia.com/index.php/Rs1800058"
}
},
{
"RSIDsList": [
11237982
],
"Chromosome": 11,
"Position": 79441694,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1939707
],
"Chromosome": 11,
"Position": 100102098,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1042602
],
"Chromosome": 11,
"Position": 88911696,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1393350
],
"Chromosome": 11,
"Position": 89011046,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1126809
],
"Chromosome": 11,
"Position": 89017961,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
11604811
],
"Chromosome": 11,
"Position": 72389984,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
3829241
],
"Chromosome": 11,
"Position": 68855363,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
35264875
],
"Chromosome": 11,
"Position": 68846399,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1939697
],
"Chromosome": 11,
"Position": 100091693,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1800422
],
"Chromosome": 11,
"Position": 89284793,
"GeneNamesList": [
"TYR"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
72928978
],
"Chromosome": 11,
"Position": 69063896,
"GeneNamesList": [
"TPCN2"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,137 +0,0 @@
[
{
"RSIDsList": [
34330
],
"Chromosome": 12,
"Position": 12717761,
"GeneNamesList": [
"CDKN1B",
"GPR19"
],
"CompanyAliases": {},
"References": {
"SNPedia - rs34330": "https://www.snpedia.com/index.php/Rs34330"
}
},
{
"RSIDsList": [
17252053
],
"Chromosome": 12,
"Position": 85727948,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1887276
],
"Chromosome": 12,
"Position": 100797485,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4433629
],
"Chromosome": 12,
"Position": 90341455,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10843104
],
"Chromosome": 12,
"Position": 28276626,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12821256
],
"Chromosome": 12,
"Position": 89328335,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7965082
],
"Chromosome": 12,
"Position": 100800193,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
9971729
],
"Chromosome": 12,
"Position": 23979791,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
642742
],
"Chromosome": 12,
"Position": 89299746,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7966317
],
"Chromosome": 12,
"Position": 100795311,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
790464
],
"Chromosome": 12,
"Position": 92174057,
"GeneNamesList": [
"BTG1-DT"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,162 +0,0 @@
[
{
"RSIDsList": [
11571746
],
"Chromosome": 13,
"Position": 32370971,
"GeneNamesList": [
"BRCA2"
],
"CompanyAliases": {
"1": [
"i5009299"
]
},
"References": {
"SNPedia.com - rs11571746": "https://www.snpedia.com/index.php/Rs11571746"
}
},
{
"RSIDsList": [
11571747
],
"Chromosome": 13,
"Position": 32371035,
"GeneNamesList": [
"BRCA2"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs11571747": "https://www.snpedia.com/index.php/Rs11571747"
}
},
{
"RSIDsList": [
766173
],
"Chromosome": 13,
"Position": 32332343,
"GeneNamesList": [
"BRCA2"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs766173": "https://www.snpedia.com/index.php/Rs766173"
}
},
{
"RSIDsList": [
1801426
],
"Chromosome": 13,
"Position": 32398747,
"GeneNamesList": [
"BRCA2"
],
"CompanyAliases": {
"1": [
"i5009256"
]
},
"References": {
"SNPedia.com - rs1801426": "https://www.snpedia.com/index.php/Rs1801426"
}
},
{
"RSIDsList": [
4987117
],
"Chromosome": 13,
"Position": 32340099,
"GeneNamesList": [
"BRCA2"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs4987117": "https://www.snpedia.com/index.php/Rs4987117"
}
},
{
"RSIDsList": [
1799954
],
"Chromosome": 13,
"Position": 32340455,
"GeneNamesList": [
"BRCA2"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs1799954": "https://www.snpedia.com/index.php/Rs1799954"
}
},
{
"RSIDsList": [
144848
],
"Chromosome": 13,
"Position": 32332592,
"GeneNamesList": [
"BRCA2"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs144848": "https://www.snpedia.com/index.php/Rs144848"
}
},
{
"RSIDsList": [
4987047
],
"Chromosome": 13,
"Position": 32379392,
"GeneNamesList": [
"BRCA2"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs4987047": "https://www.snpedia.com/index.php/Rs4987047"
}
},
{
"RSIDsList": [
11571833
],
"Chromosome": 13,
"Position": 32398489,
"GeneNamesList": [
"BRCA2"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs11571833": "https://www.snpedia.com/index.php/Rs11571833"
}
},
{
"RSIDsList": [
2095645
],
"Chromosome": 13,
"Position": 74178399,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
9301973,
61272261,
17254025
],
"Chromosome": 13,
"Position": 94537147,
"GeneNamesList": [
"DCT"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,36 +0,0 @@
[
{
"RSIDsList": [
12896399
],
"Chromosome": 14,
"Position": 92773663,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
17184180
],
"Chromosome": 14,
"Position": 92780387,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
138777265
],
"Chromosome": 14,
"Position": 68769419,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,447 +0,0 @@
[
{
"RSIDsList": [
7183877
],
"Chromosome": 15,
"Position": 28365733,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1800407
],
"Chromosome": 15,
"Position": 28230318,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1129038
],
"Chromosome": 15,
"Position": 28356859,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7495174
],
"Chromosome": 15,
"Position": 28344238,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7174027
],
"Chromosome": 15,
"Position": 28328765,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1800414
],
"Chromosome": 15,
"Position": 28197037,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2240203
],
"Chromosome": 15,
"Position": 28494202,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4778218
],
"Chromosome": 15,
"Position": 28211758,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4778211
],
"Chromosome": 15,
"Position": 28199305,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
728405
],
"Chromosome": 15,
"Position": 28199853,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
8028689
],
"Chromosome": 15,
"Position": 28488888,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12906280
],
"Chromosome": 15,
"Position": 30265887,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
3935591
],
"Chromosome": 15,
"Position": 28374012,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1667394
],
"Chromosome": 15,
"Position": 28530182,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1800401
],
"Chromosome": 15,
"Position": 28260053,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12913823
],
"Chromosome": 15,
"Position": 50509591,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1426654
],
"Chromosome": 15,
"Position": 48426484,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12913832
],
"Chromosome": 15,
"Position": 28365618,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12593929
],
"Chromosome": 15,
"Position": 28359258,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
916977
],
"Chromosome": 15,
"Position": 28513364,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
11636232
],
"Chromosome": 15,
"Position": 28386626,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4778241
],
"Chromosome": 15,
"Position": 28338713,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
8039195
],
"Chromosome": 15,
"Position": 28516084,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
3794604
],
"Chromosome": 15,
"Position": 28272065,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
989869
],
"Chromosome": 15,
"Position": 28006306,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1834640
],
"Chromosome": 15,
"Position": 48392165,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7170852
],
"Chromosome": 15,
"Position": 28427986,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4778138
],
"Chromosome": 15,
"Position": 28335820,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
784416
],
"Chromosome": 15,
"Position": 49012925,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7176696
],
"Chromosome": 15,
"Position": 49073903,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
3940272
],
"Chromosome": 15,
"Position": 28468723,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2238289
],
"Chromosome": 15,
"Position": 28453215,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
937171
],
"Chromosome": 15,
"Position": 50194749,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
11631797
],
"Chromosome": 15,
"Position": 28502279,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12437560
],
"Chromosome": 15,
"Position": 61832507,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
11855019,
59065625
],
"Chromosome": 15,
"Position": 28090674,
"GeneNamesList": [
"OCA2"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7182710,
17466298,
61298156
],
"Chromosome": 15,
"Position": 48812737,
"GeneNamesList": [
"CEP152"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,50 +0,0 @@
[
{
"RSIDsList": [
1805007
],
"Chromosome": 16,
"Position": 89986117,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1805008
],
"Chromosome": 16,
"Position": 89986144,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
3212369
],
"Chromosome": 16,
"Position": 89986760,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
3212368
],
"Chromosome": 16,
"Position": 89920224,
"GeneNamesList": [
"MC1R"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,214 +0,0 @@
[
{
"RSIDsList": [
1799966
],
"Chromosome": 17,
"Position": 43071077,
"GeneNamesList": [
"BRCA1"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs1799966": "https://www.snpedia.com/index.php/Rs1799966"
}
},
{
"RSIDsList": [
1799950
],
"Chromosome": 17,
"Position": 43094464,
"GeneNamesList": [
"BRCA1"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs1799950": "https://www.snpedia.com/index.php/Rs1799950"
}
},
{
"RSIDsList": [
2227945
],
"Chromosome": 17,
"Position": 43092113,
"GeneNamesList": [
"BRCA1"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs2227945": "https://www.snpedia.com/index.php/Rs2227945"
}
},
{
"RSIDsList": [
16942
],
"Chromosome": 17,
"Position": 43091983,
"GeneNamesList": [
"BRCA1"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs16942": "https://www.snpedia.com/index.php/Rs16942"
}
},
{
"RSIDsList": [
4986850
],
"Chromosome": 17,
"Position": 43093454,
"GeneNamesList": [
"BRCA1"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs4986850": "https://www.snpedia.com/index.php/Rs4986850"
}
},
{
"RSIDsList": [
9894429
],
"Chromosome": 17,
"Position": 79596811,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12452184
],
"Chromosome": 17,
"Position": 79664426,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
16977009
],
"Chromosome": 17,
"Position": 69916524,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7219915
],
"Chromosome": 17,
"Position": 79591813,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
8079498
],
"Chromosome": 17,
"Position": 69919452,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
3809761
],
"Chromosome": 17,
"Position": 67497367,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
16977008
],
"Chromosome": 17,
"Position": 69916480,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
16977002
],
"Chromosome": 17,
"Position": 71919192,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6420484,
59590586,
17859003,
17846019
],
"Chromosome": 17,
"Position": 81645371,
"GeneNamesList": [
"TSPAN10"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4793389
],
"Chromosome": 17,
"Position": 71921776,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4790309,
58087488
],
"Chromosome": 17,
"Position": 2063595,
"GeneNamesList": [
"HIC1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7214306
],
"Chromosome": 17,
"Position": 71925130,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,62 +0,0 @@
[
{
"RSIDsList": [
1008591
],
"Chromosome": 19,
"Position": 46730614,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1019212,
58273978,
17660257
],
"Chromosome": 19,
"Position": 46225962,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
73488486
],
"Chromosome": 19,
"Position": 7516739,
"GeneNamesList": [
"ZNF358"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10424065
],
"Chromosome": 19,
"Position": 3545024,
"GeneNamesList": [
"MFSD12"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
142317543
],
"Chromosome": 19,
"Position": 3547687,
"GeneNamesList": [
"MFSD12"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,276 +0,0 @@
[
{
"RSIDsList": [
182549
],
"Chromosome": 2,
"Position": 135859184,
"GeneNamesList": [
"MCM6"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs182549": "https://www.snpedia.com/index.php/Rs182549"
}
},
{
"RSIDsList": [
1045485
],
"Chromosome": 2,
"Position": 201284866,
"GeneNamesList": [
"CASP8"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs1045485": "https://www.snpedia.com/index.php/Rs1045485"
}
},
{
"RSIDsList": [
4988235
],
"Chromosome": 2,
"Position": 135851076,
"GeneNamesList": [
"MCM6"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs4988235": "https://www.snpedia.com/index.php/Rs4988235"
}
},
{
"RSIDsList": [
7349332
],
"Chromosome": 2,
"Position": 218891661,
"GeneNamesList": [
"WNT10A"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs7349332": "https://www.snpedia.com/index.php/Rs7349332"
}
},
{
"RSIDsList": [
2422241
],
"Chromosome": 2,
"Position": 119043036,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
16863422
],
"Chromosome": 2,
"Position": 222990015,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12694574
],
"Chromosome": 2,
"Position": 222993733,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1105879
],
"Chromosome": 2,
"Position": 234602202,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
974448
],
"Chromosome": 2,
"Position": 223005314,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1005999
],
"Chromosome": 2,
"Position": 105523791,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2070959
],
"Chromosome": 2,
"Position": 234602191,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1978859
],
"Chromosome": 2,
"Position": 223082331,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2894450
],
"Chromosome": 2,
"Position": 222997104,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2422239
],
"Chromosome": 2,
"Position": 119029079,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
892839
],
"Chromosome": 2,
"Position": 239406446,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10209564
],
"Chromosome": 2,
"Position": 239459603,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
121908120
],
"Chromosome": 2,
"Position": 219755011,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12614022
],
"Chromosome": 2,
"Position": 222618951,
"GeneNamesList": [
"FARSB"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6749293
],
"Chromosome": 2,
"Position": 172302075,
"GeneNamesList": [
"LOC107985960"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
112747614
],
"Chromosome": 2,
"Position": 206085512,
"GeneNamesList": [
"INO80D"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
74409360
],
"Chromosome": 2,
"Position": 238367637,
"GeneNamesList": [
"TRAF3IP1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
13016869,
56853446,
56528773
],
"Chromosome": 2,
"Position": 46006242,
"GeneNamesList": [
"PRKCE"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,234 +0,0 @@
[
{
"RSIDsList": [
4053148
],
"Chromosome": 20,
"Position": 8772544,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4911414
],
"Chromosome": 20,
"Position": 32729444,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4911442
],
"Chromosome": 20,
"Position": 33355046,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2748901
],
"Chromosome": 20,
"Position": 4948248,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1015092
],
"Chromosome": 20,
"Position": 8750062,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
911020
],
"Chromosome": 20,
"Position": 49671946,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6058017
],
"Chromosome": 20,
"Position": 32856998,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2327089
],
"Chromosome": 20,
"Position": 8769180,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6020957
],
"Chromosome": 20,
"Position": 49687635,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2327101
],
"Chromosome": 20,
"Position": 8734263,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6039266
],
"Chromosome": 20,
"Position": 8766071,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6056066
],
"Chromosome": 20,
"Position": 8738169,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
975633
],
"Chromosome": 20,
"Position": 8765289,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4633993,
111186477
],
"Chromosome": 20,
"Position": 8789461,
"GeneNamesList": [
"PLCB1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
911015,
60505384
],
"Chromosome": 20,
"Position": 51073634,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6020940,
7271570
],
"Chromosome": 20,
"Position": 51058312,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6056119,
58122852,
6516401
],
"Chromosome": 20,
"Position": 8792648,
"GeneNamesList": [
"PLCB1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6056126,
59241198,
7260663
],
"Chromosome": 20,
"Position": 8795023,
"GeneNamesList": [
"PLCB1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6039272,
7269212
],
"Chromosome": 20,
"Position": 8792227,
"GeneNamesList": [
"PLCB1"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,98 +0,0 @@
[
{
"RSIDsList": [
2252893
],
"Chromosome": 21,
"Position": 38507572,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2835630
],
"Chromosome": 21,
"Position": 38521842,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1003719
],
"Chromosome": 21,
"Position": 38491095,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2835621
],
"Chromosome": 21,
"Position": 38510616,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2832438
],
"Chromosome": 21,
"Position": 31137937,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7277820
],
"Chromosome": 21,
"Position": 38580309,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2835660
],
"Chromosome": 21,
"Position": 37196581,
"GeneNamesList": [
"TTC3"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
622330
],
"Chromosome": 21,
"Position": 43363407,
"GeneNamesList": [
"LINC01679"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,41 +0,0 @@
[
{
"RSIDsList": [
17879961
],
"Chromosome": 22,
"Position": 28725099,
"GeneNamesList": [
"CHEK2"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs17879961": "https://www.snpedia.com/index.php/Rs17879961"
}
},
{
"RSIDsList": [
397723
],
"Chromosome": 22,
"Position": 48112790,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
35051352,
62226058
],
"Chromosome": 22,
"Position": 45973777,
"GeneNamesList": [
"WNT7B"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,231 +0,0 @@
[
{
"RSIDsList": [
4552364
],
"Chromosome": 3,
"Position": 88974863,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
717463
],
"Chromosome": 3,
"Position": 59372700,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
116359091
],
"Chromosome": 3,
"Position": 69980177,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6795519
],
"Chromosome": 3,
"Position": 59388206,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
9858909
],
"Chromosome": 3,
"Position": 88378348,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
13097965
],
"Chromosome": 3,
"Position": 184339757,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
17447439
],
"Chromosome": 3,
"Position": 189549423,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4353811
],
"Chromosome": 3,
"Position": 88981207,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7628370
],
"Chromosome": 3,
"Position": 59370600,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2034127
],
"Chromosome": 3,
"Position": 59368074,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2168809
],
"Chromosome": 3,
"Position": 88377746,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2034129
],
"Chromosome": 3,
"Position": 59368293,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2034128
],
"Chromosome": 3,
"Position": 59368259,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
894883
],
"Chromosome": 3,
"Position": 59373255,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
3912104
],
"Chromosome": 3,
"Position": 42720996,
"GeneNamesList": [
"CCDC13"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7617069
],
"Chromosome": 3,
"Position": 59384969,
"GeneNamesList": [
"CFAP20DC-DT"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
13098099,
60851446
],
"Chromosome": 3,
"Position": 184621879,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7640340,
61716056
],
"Chromosome": 3,
"Position": 59394285,
"GeneNamesList": [
"CFAP20DC-DT"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
875143,
61193087
],
"Chromosome": 3,
"Position": 59394645,
"GeneNamesList": [
"CFAP20DC-DT"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,37 +0,0 @@
[
{
"RSIDsList": [
6828137
],
"Chromosome": 4,
"Position": 90059434,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
141318671
],
"Chromosome": 4,
"Position": 58493393,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
4521336,
58489362
],
"Chromosome": 4,
"Position": 23937776,
"GeneNamesList": [
"PPARGC1A"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,96 +0,0 @@
[
{
"RSIDsList": [
11957757
],
"Chromosome": 5,
"Position": 148216187,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
16891982
],
"Chromosome": 5,
"Position": 33951693,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
348613
],
"Chromosome": 5,
"Position": 40273518,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6555969
],
"Chromosome": 5,
"Position": 171128464,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
26722
],
"Chromosome": 5,
"Position": 33963870,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
28777
],
"Chromosome": 5,
"Position": 33958959,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
72777200
],
"Chromosome": 5,
"Position": 124561295,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
62330021
],
"Chromosome": 5,
"Position": 311787,
"GeneNamesList": [
"PDCD6"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,66 +0,0 @@
[
{
"RSIDsList": [
6918152
],
"Chromosome": 6,
"Position": 542159,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1540771
],
"Chromosome": 6,
"Position": 466033,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12203592
],
"Chromosome": 6,
"Position": 396321,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6910861,
111318576,
63129962,
58859209
],
"Chromosome": 6,
"Position": 10537950,
"GeneNamesList": [
"GCNT2"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
341147,
614213
],
"Chromosome": 6,
"Position": 158420693,
"GeneNamesList": [
"TULP4"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,808 +0,0 @@
[
{
"RSIDsList": [
80034486
],
"Chromosome": 7,
"Position": 117652877,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i5012079",
"i4000311"
]
},
"References": {
"SNPedia.com - rs80034486": "https://www.snpedia.com/index.php/Rs80034486"
}
},
{
"RSIDsList": [
121908745
],
"Chromosome": 7,
"Position": 117559590,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs121908745": "https://www.snpedia.com/index.php/Rs121908745"
}
},
{
"RSIDsList": [
74551128
],
"Chromosome": 7,
"Position": 117548795,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000291",
"i5006050",
"i5011205"
]
},
"References": {
"SNPedia.com - rs74551128": "https://www.snpedia.com/index.php/Rs74551128"
}
},
{
"RSIDsList": [
75096551
],
"Chromosome": 7,
"Position": 117606754,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i5011728",
"i6056297",
"i4000321"
]
},
"References": {
"SNPedia.com - rs75096551": "https://www.snpedia.com/index.php/Rs75096551"
}
},
{
"RSIDsList": [
76713772
],
"Chromosome": 7,
"Position": 117587738,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000317",
"i5011301",
"i6056292"
],
"2": [
"VG07S45090"
],
"3": [
"VG07S45090"
]
},
"References": {
"SNPedia.com - rs76713772": "https://www.snpedia.com/index.php/Rs76713772"
}
},
{
"RSIDsList": [
121909011
],
"Chromosome": 7,
"Position": 117540230,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000296",
"i5006070",
"i5011077"
]
},
"References": {
"SNPedia.com - rs121909011": "https://www.snpedia.com/index.php/Rs121909011"
}
},
{
"RSIDsList": [
75961395
],
"Chromosome": 7,
"Position": 117509123,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000294"
],
"2": [
"VG07S29458"
],
"3": [
"VG07S29458"
]
},
"References": {
"SNPedia.com - rs75961395": "https://www.snpedia.com/index.php/Rs75961395"
}
},
{
"RSIDsList": [
78655421
],
"Chromosome": 7,
"Position": 117530975,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i5010839",
"i5006049",
"i4000295",
"i5010838",
"i5010837"
],
"2": [
"VG07S29628"
],
"3": [
"VG07S29628"
]
},
"References": {
"SNPedia.com - rs78655421": "https://www.snpedia.com/index.php/Rs78655421"
}
},
{
"RSIDsList": [
75039782
],
"Chromosome": 7,
"Position": 117639961,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i5011981",
"i4000325"
],
"2": [
"VG07S52449"
],
"3": [
"VG07S52449"
]
},
"References": {
"SNPedia.com - rs75039782": "https://www.snpedia.com/index.php/Rs75039782"
}
},
{
"RSIDsList": [
80224560
],
"Chromosome": 7,
"Position": 117602868,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000320",
"i5011620"
]
},
"References": {
"SNPedia.com - i4000320": "https://www.snpedia.com/index.php/I4000320",
"SNPedia.com - rs80224560": "https://www.snpedia.com/index.php/Rs80224560"
}
},
{
"RSIDsList": [
77188391
],
"Chromosome": 7,
"Position": 117534366,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000315",
"i5010951"
],
"2": [
"VG07S44986"
],
"3": [
"VG07S44986"
]
},
"References": {
"SNPedia.com - rs77188391": "https://www.snpedia.com/index.php/Rs77188391"
}
},
{
"RSIDsList": [
74597325
],
"Chromosome": 7,
"Position": 117587811,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000306",
"i5006055",
"i5011335",
"i6056294"
],
"2": [
"VG07S29297"
],
"3": [
"VG07S29297"
]
},
"References": {
"SNPedia.com - rs74597325": "https://www.snpedia.com/index.php/Rs74597325"
}
},
{
"RSIDsList": [
121908747
],
"Chromosome": 7,
"Position": 117627581,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000322"
]
},
"References": {
"SNPedia.com - rs121908747": "https://www.snpedia.com/index.php/Rs121908747"
}
},
{
"RSIDsList": [
113993960,
199826652
],
"Chromosome": 7,
"Position": 117559592,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i3000001",
"i5011261"
]
},
"References": {
"SNPedia.com - rs113993960": "https://www.snpedia.com/index.php/Rs113993960"
}
},
{
"RSIDsList": [
77932196
],
"Chromosome": 7,
"Position": 117540270,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000297",
"i5011094",
"i5011095"
]
},
"References": {
"SNPedia.com - rs77932196": "https://www.snpedia.com/index.php/Rs77932196"
}
},
{
"RSIDsList": [
121908748
],
"Chromosome": 7,
"Position": 117590440,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000318",
"i5006139",
"i5011416",
"i5011417",
"i5011418"
]
},
"References": {
"SNPedia.com - rs121908748": "https://www.snpedia.com/index.php/Rs121908748"
}
},
{
"RSIDsList": [
113993959
],
"Chromosome": 7,
"Position": 117587778,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000300",
"i5006109",
"i5011314"
]
},
"References": {
"SNPedia.com - rs113993959": "https://www.snpedia.com/index.php/Rs113993959"
}
},
{
"RSIDsList": [
74767530
],
"Chromosome": 7,
"Position": 117627537,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i5011932",
"i4000308",
"i6056298"
],
"2": [
"VG07S29424"
],
"3": [
"VG07S29424"
]
},
"References": {
"SNPedia.com - rs74767530": "https://www.snpedia.com/index.php/Rs74767530"
}
},
{
"RSIDsList": [
77010898
],
"Chromosome": 7,
"Position": 117642566,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000309",
"i5012037",
"i6056299"
],
"2": [
"VG07S29451"
],
"3": [
"VG07S29451"
]
},
"References": {
"SNPedia.com - rs77010898": "https://www.snpedia.com/index.php/Rs77010898"
}
},
{
"RSIDsList": [
121908746
],
"Chromosome": 7,
"Position": 117592219,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {},
"References": {
"SNPedia.com - rs121908746": "https://www.snpedia.com/index.php/Rs121908746"
}
},
{
"RSIDsList": [
75527207
],
"Chromosome": 7,
"Position": 117587806,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000305",
"i5006054",
"i5011331"
],
"2": [
"VG07S29293"
],
"3": [
"VG07S29293"
]
},
"References": {
"SNPedia.com - rs75527207": "https://www.snpedia.com/index.php/Rs75527207"
}
},
{
"RSIDsList": [
78756941
],
"Chromosome": 7,
"Position": 117531115,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000314",
"i5010909",
"i6056291"
],
"2": [
"VG07S44961"
],
"3": [
"VG07S44961"
]
},
"References": {
"SNPedia.com - rs78756941": "https://www.snpedia.com/index.php/Rs78756941"
}
},
{
"RSIDsList": [
80055610
],
"Chromosome": 7,
"Position": 117587833,
"GeneNamesList": [
"CFTR"
],
"CompanyAliases": {
"1": [
"i4000307",
"i5011358",
"i5011359"
]
},
"References": {
"SNPedia.com - rs80055610": "https://www.snpedia.com/index.php/Rs80055610"
}
},
{
"RSIDsList": [
6944702
],
"Chromosome": 7,
"Position": 83653553,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6462562
],
"Chromosome": 7,
"Position": 4088555,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2854746
],
"Chromosome": 7,
"Position": 45960645,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6462544
],
"Chromosome": 7,
"Position": 4077620,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10237838
],
"Chromosome": 7,
"Position": 4073998,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12155314
],
"Chromosome": 7,
"Position": 4081194,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10266101
],
"Chromosome": 7,
"Position": 4073819,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2108166
],
"Chromosome": 7,
"Position": 42125871,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10485860
],
"Chromosome": 7,
"Position": 4090283,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
9692219
],
"Chromosome": 7,
"Position": 4043701,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7803030
],
"Chromosome": 7,
"Position": 4038558,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2342494
],
"Chromosome": 7,
"Position": 4032591,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6950754
],
"Chromosome": 7,
"Position": 4037491,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7807181
],
"Chromosome": 7,
"Position": 4046812,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10278187,
57744561
],
"Chromosome": 7,
"Position": 4034741,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1562005,
58720272
],
"Chromosome": 7,
"Position": 4044191,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7781059,
57876852,
10351382
],
"Chromosome": 7,
"Position": 4046687,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10237319
],
"Chromosome": 7,
"Position": 4033969,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10237488,
59841339
],
"Chromosome": 7,
"Position": 4034710,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10234405,
58770991
],
"Chromosome": 7,
"Position": 4034827,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1562006,
59417113
],
"Chromosome": 7,
"Position": 4043872,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7779616,
56955120,
17293919,
10377747
],
"Chromosome": 7,
"Position": 4046408,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
10265937
],
"Chromosome": 7,
"Position": 4034017,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
7799331,
58503650,
10365277
],
"Chromosome": 7,
"Position": 4046491,
"GeneNamesList": [
"SDK1"
],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,36 +0,0 @@
[
{
"RSIDsList": [
147068120
],
"Chromosome": 8,
"Position": 81350433,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12543326
],
"Chromosome": 8,
"Position": 42003663,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6997494
],
"Chromosome": 8,
"Position": 12833488,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,108 +0,0 @@
[
{
"RSIDsList": [
12552712
],
"Chromosome": 9,
"Position": 27366436,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
6478394
],
"Chromosome": 9,
"Position": 121836674,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1158810
],
"Chromosome": 9,
"Position": 121809519,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
13297008
],
"Chromosome": 9,
"Position": 12677471,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2762462
],
"Chromosome": 9,
"Position": 12699776,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1408799
],
"Chromosome": 9,
"Position": 12672097,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
1325127
],
"Chromosome": 9,
"Position": 12668328,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
2733832
],
"Chromosome": 9,
"Position": 12704725,
"GeneNamesList": [
"MISSING"
],
"CompanyAliases": {},
"References": {}
},
{
"RSIDsList": [
12335410
],
"Chromosome": 9,
"Position": 129238777,
"GeneNamesList": [],
"CompanyAliases": {},
"References": {}
}
]


@@ -1,5 +1,5 @@
// locusMetadata provides information about gene locations.
// locusMetadata provides information about locations in the human genome.
package locusMetadata
@@ -9,72 +9,13 @@ import "seekia/internal/helpers"
import _ "embed"
import "encoding/json"
import "encoding/gob"
import "errors"
import "bytes"
//go:embed LocusMetadata_Chromosome1.json
var LocusMetadataFile_Chromosome1 []byte
//go:embed LocusMetadata_Chromosome2.json
var LocusMetadataFile_Chromosome2 []byte
//go:embed LocusMetadata_Chromosome3.json
var LocusMetadataFile_Chromosome3 []byte
//go:embed LocusMetadata_Chromosome4.json
var LocusMetadataFile_Chromosome4 []byte
//go:embed LocusMetadata_Chromosome5.json
var LocusMetadataFile_Chromosome5 []byte
//go:embed LocusMetadata_Chromosome6.json
var LocusMetadataFile_Chromosome6 []byte
//go:embed LocusMetadata_Chromosome7.json
var LocusMetadataFile_Chromosome7 []byte
//go:embed LocusMetadata_Chromosome8.json
var LocusMetadataFile_Chromosome8 []byte
//go:embed LocusMetadata_Chromosome9.json
var LocusMetadataFile_Chromosome9 []byte
//go:embed LocusMetadata_Chromosome10.json
var LocusMetadataFile_Chromosome10 []byte
//go:embed LocusMetadata_Chromosome11.json
var LocusMetadataFile_Chromosome11 []byte
//go:embed LocusMetadata_Chromosome12.json
var LocusMetadataFile_Chromosome12 []byte
//go:embed LocusMetadata_Chromosome13.json
var LocusMetadataFile_Chromosome13 []byte
//go:embed LocusMetadata_Chromosome14.json
var LocusMetadataFile_Chromosome14 []byte
//go:embed LocusMetadata_Chromosome15.json
var LocusMetadataFile_Chromosome15 []byte
//go:embed LocusMetadata_Chromosome16.json
var LocusMetadataFile_Chromosome16 []byte
//go:embed LocusMetadata_Chromosome17.json
var LocusMetadataFile_Chromosome17 []byte
//go:embed LocusMetadata_Chromosome19.json
var LocusMetadataFile_Chromosome19 []byte
//go:embed LocusMetadata_Chromosome20.json
var LocusMetadataFile_Chromosome20 []byte
//go:embed LocusMetadata_Chromosome21.json
var LocusMetadataFile_Chromosome21 []byte
//go:embed LocusMetadata_Chromosome22.json
var LocusMetadataFile_Chromosome22 []byte
//go:embed LocusMetadata.gob
var LocusMetadataFile []byte
type LocusMetadata struct{
@@ -93,10 +34,16 @@ type LocusMetadata struct{
// This is a number describing its location on the chromosome it exists on.
Position int
// This is true if we know any information about the gene this locus belongs to, and whether a gene even exists
GeneInfoIsKnown bool
// This is true if the locus exists within a gene
// Some loci are non-coding, meaning they do not exist within a gene and do not code for a protein
GeneExists bool
// A list of gene names which refer to the gene which this locus belongs to.
// Each gene name refers to the same gene.
// Will be a list containing "MISSING" if the gene name has not been added yet
// Will be an empty list if no gene exists
// Will be a nil list if gene info is not known, or no gene exists
GeneNamesList []string
// A list of alternate names for the rsid used by companies
@@ -115,7 +62,7 @@ const TwentyThreeAndMe GeneticsCompany = 1
const FamilyTreeDNA GeneticsCompany = 2
const MyHeritage GeneticsCompany = 3
// Map Structure: RSID -> LocusMetadata object
// Map Structure: RSID -> Locus Metadata Object
var lociMetadataMap map[int64]LocusMetadata
// This map stores a list of aliases for rsids which have aliases
@@ -146,32 +93,32 @@ func InitializeLocusMetadataVariables()error{
rsidsList := locusObject.RSIDsList
for _, rsid := range rsidsList{
for _, rsID := range rsidsList{
_, exists := lociMetadataMap[rsid]
_, exists := lociMetadataMap[rsID]
if (exists == true){
return errors.New("lociMetadataMap contains duplicate rsid.")
return errors.New("lociMetadataMap contains duplicate rsID.")
}
lociMetadataMap[rsid] = locusObject
lociMetadataMap[rsID] = locusObject
}
if (len(rsidsList) > 1){
// We add rsid aliases to map
for _, rsid := range rsidsList{
for _, rsID := range rsidsList{
rsidAliasesList := make([]int64, 0)
for _, rsidInner := range rsidsList{
if (rsid != rsidInner){
if (rsID != rsidInner){
rsidAliasesList = append(rsidAliasesList, rsidInner)
}
}
rsidAliasesMap[rsid] = rsidAliasesList
rsidAliasesMap[rsID] = rsidAliasesList
}
}
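The alias-map loop above pairs every rsID at a locus with a list of its sibling rsIDs. A standalone sketch of the same construction (buildRSIDAliasesMap is a hypothetical helper name, not a function from the repository):

```go
package main

import "fmt"

// buildRSIDAliasesMap maps each rsID at a single locus to the list of all
// other rsIDs for that locus. Loci with a single rsID produce no aliases.
func buildRSIDAliasesMap(rsidsList []int64) map[int64][]int64 {
	rsidAliasesMap := make(map[int64][]int64)
	if len(rsidsList) <= 1 {
		return rsidAliasesMap
	}
	for _, rsID := range rsidsList {
		rsidAliasesList := make([]int64, 0, len(rsidsList)-1)
		for _, rsidInner := range rsidsList {
			if rsID != rsidInner {
				rsidAliasesList = append(rsidAliasesList, rsidInner)
			}
		}
		rsidAliasesMap[rsID] = rsidAliasesList
	}
	return rsidAliasesMap
}

func main() {
	// rsIDs taken from one of the chromosome 20 entries above.
	aliasesMap := buildRSIDAliasesMap([]int64{4633993, 111186477})
	fmt.Println(aliasesMap[4633993]) // → [111186477]
}
```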
@@ -288,124 +235,17 @@ func GetCompanyAliasRSID(companyName string, locusAlias string)(bool, int64, err
return false, 0, errors.New("GetCompanyAliasRSID called with invalid companyName: " + companyName)
}
// This function is only public for use in testing
func GetLocusMetadataObjectsList()([]LocusMetadata, error){
chromosomesList := []int{1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22}
buffer := bytes.NewBuffer(LocusMetadataFile)
locusMetadataObjectsList := make([]LocusMetadata, 0, len(chromosomesList))
for _, chromosomesInt := range chromosomesList{
chromosomeLocusMetadataObjectsList, err := GetLocusMetadataObjectsListByChromosome(chromosomesInt)
if (err != nil){ return nil, err }
locusMetadataObjectsList = append(locusMetadataObjectsList, chromosomeLocusMetadataObjectsList...)
}
return locusMetadataObjectsList, nil
}
func GetLocusMetadataObjectsListByChromosome(chromosome int)([]LocusMetadata, error){
if (chromosome < 1 || chromosome > 22){
chromosomeString := helpers.ConvertIntToString(chromosome)
return nil, errors.New("GetLocusMetadataObjectsListByChromosome called with invalid chromosome: " + chromosomeString)
}
// Outputs:
// -bool: File exists
// -[]byte: File bytes
getFileBytes := func()(bool, []byte){
switch chromosome{
case 1:{
return true, LocusMetadataFile_Chromosome1
}
case 2:{
return true, LocusMetadataFile_Chromosome2
}
case 3:{
return true, LocusMetadataFile_Chromosome3
}
case 4:{
return true, LocusMetadataFile_Chromosome4
}
case 5:{
return true, LocusMetadataFile_Chromosome5
}
case 6:{
return true, LocusMetadataFile_Chromosome6
}
case 7:{
return true, LocusMetadataFile_Chromosome7
}
case 8:{
return true, LocusMetadataFile_Chromosome8
}
case 9:{
return true, LocusMetadataFile_Chromosome9
}
case 10:{
return true, LocusMetadataFile_Chromosome10
}
case 11:{
return true, LocusMetadataFile_Chromosome11
}
case 12:{
return true, LocusMetadataFile_Chromosome12
}
case 13:{
return true, LocusMetadataFile_Chromosome13
}
case 14:{
return true, LocusMetadataFile_Chromosome14
}
case 15:{
return true, LocusMetadataFile_Chromosome15
}
case 16:{
return true, LocusMetadataFile_Chromosome16
}
case 17:{
return true, LocusMetadataFile_Chromosome17
}
//case 18:{
// return true, LocusMetadataFile_Chromosome18
//}
case 19:{
return true, LocusMetadataFile_Chromosome19
}
case 20:{
return true, LocusMetadataFile_Chromosome20
}
case 21:{
return true, LocusMetadataFile_Chromosome21
}
case 22:{
return true, LocusMetadataFile_Chromosome22
}
}
return false, nil
}
fileExists, fileBytes := getFileBytes()
if (fileExists == false){
// No loci exist for this chromosome
emptyList := make([]LocusMetadata, 0)
return emptyList, nil
}
decoder := gob.NewDecoder(buffer)
var locusMetadataObjectsList []LocusMetadata
err := json.Unmarshal(fileBytes, &locusMetadataObjectsList)
if (err != nil) { return nil, err }
err := decoder.Decode(&locusMetadataObjectsList)
if (err != nil){ return nil, err }
return locusMetadataObjectsList, nil
}

Some files were not shown because too many files have changed in this diff.