[VirtEngine/Papers] Emerging Decentralized Autonomous Networks - Page 2
Comments

  • @DETio said:

    @Arkas said:

    @DETio said: Bump to visibility

    Stop bumping the post!

    Just inviting a discussion. Some people talked shit, then they disappeared. Why won't they come back for another round?

    Nobody is interested? Too much buzzword soup turns people off. It comes across as vaporware.

  • raindog308 Administrator, Veteran

    @Maounique said: The future is in decentralized virtualization and resources pooling as it was intended when the internet was created with built-in redundancy as well.

    The internet has nothing to do with virtualization or resources. It's just internetworking.

    When the internet was invented, I think the only virtualization in use was on IBM Mainframe systems.

  • Francisco Top Host, Host Rep, Veteran

    @DETio said: Just inviting a discussion. Some people talked shit, then they disappeared. Why won't they come back for another round?

    You're confused.

    No one was talking shit about the project/product. I don't think you got any comments one way or the other about that, apart from people asking how this relates to the original control panel you were working on.

    People, like myself, felt it was an insanely slippery slope to start allowing crypto projects to start/advertise on here, regardless of their funding & intentions.

    Francisco

  • Maounique Host Rep, Veteran
    edited June 2023

    @raindog308 said: The internet has nothing to do with virtualization or resources. It's just internetworking.

    So, let's go back to basics and roll back all progress. In theory, the internet has nothing to do with computers either; it is just the communication...

    Internet 3.0 could be enhanced with many protocols, including the virtualization of communication itself. Traffic already goes over peer-to-peer encrypted links; those could run over VPNs within meshes, very far from the original signal processing, and some of it includes storage as well. The next step is real virtualization: pooling resources over some communication protocol.

  • @Maounique said:

    @raindog308 said: The internet has nothing to do with virtualization or resources. It's just internetworking.

    So, let's go back to basics and roll back all progress. In theory, the internet has nothing to do with computers either; it is just the communication...

    Internet 3.0 could be enhanced with many protocols, including the virtualization of communication itself. Traffic already goes over peer-to-peer encrypted links; those could run over VPNs within meshes, very far from the original signal processing, and some of it includes storage as well. The next step is real virtualization: pooling resources over some communication protocol.

    I think you're both talking about different layers of the OSI model.

  • doghouch Member
    edited June 2023

    @Francisco said:

    @DETio said: Just inviting a discussion. Some people talked shit, then they disappeared. Why won't they come back for another round?

    You're confused.

    No one was talking shit about the project/product. I don't think you got any comments one way or the other about that, apart from people asking how this relates to the original control panel you were working on.

    People, like myself, felt it was an insanely slippery slope to start allowing crypto projects to start/advertise on here, regardless of their funding & intentions.

    Francisco

    To add to your post: some people will be adamant that crypto/blockchain/web 3.0/whatever are just ‘round the corner. Some people are equally against the idea. Hence, as you’ve pointed out, it’s a slippery slope to allow here.

    My bias
    Given my first comment, you can probably deduce that I don’t believe in Web 3.0 ever being adopted.
  • @Maounique said:
    Whoever gets this right would have invented internet 3.0.
    The future is in decentralized virtualization and resources pooling as it was intended when the internet was created with built-in redundancy as well.
    The encrypted communications are just a bonus.

    I hope in 3 years I will be able to offer my computer's power to an anonymous network, get credits and spin a VM there whenever I need to access any exclusive resources and the rest of the internet through portals to wikipedia or github for example.

    This is already possible in a way. Look at Akash Network (crypto/web3). It's Docker, though, not full VMs.

  • @Cabbage said:
    I really don't understand this weird trend where people try to couple blockchain with AI. I just don't think they mix very well, especially considering how AI tends to be unpredictable, even with strict rules. Service sounds rather centralised, considering how it aims to implement SMS verification (someone needs to send them) and biometrics (I'm pretty sure it's not a reliable and consistent key source). I would just stick to PGP and mail people using it, really.

    I think AI and P2P have a very strong use case, but based on what I have seen so far, I do agree AI in a blockchain makes 0 sense. It's just hype for VC $$$, then dumping on retail gamblers (exit liquidity).

  • Maounique Host Rep, Veteran
    edited June 2023

    @pcfreak30 said: This is already possible in a way. Look at Akash Network (crypto/web3). It's Docker, though, not full VMs.

    I have advocated this for years; I hope to see it come true soon, as an industry standard supported by various big organizations.
    It would be the ultimate model of resilience, no longer limited to any location or DC, inherently secure and encrypted ALL OVER.

    @TimboJones said: I think you're both talking about different layers of the OSI model.

    Using the OSI analogy, yes, but it doesn't really have to be a layer 8, 9, etc., albeit that could work perfectly. Say layer 8 is the mesh over layer 7, decoupled from the actual communication layers: the applications talk to each other at layer 7, and on top of the applications a mesh (p2p or something similar) exists, fully encrypted and controlled by the apps, while another, virtual app layer communicates on top of said mesh. But there can be other models, for example everything integrated at layer 8: a distributed system like a virtual node, using the computing power of many physical computers, facilitated by the apps, completely transparent regarding the CPU make and model; the ultimate HAL.

    It might seem complex, but I am absolutely convinced it is fully doable and not that hard.

    As for the blockchain, that could authenticate the "credits" system, i.e. how much computing power and transfer/storage you put into the system, which you could spend when needed.
    The AI part is kind of unclear to me atm, but it could be used through deep learning to manage bottlenecks and do load balancing, at least.
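The mesh-over-layer-7 idea can be illustrated as an application-level overlay: nodes hold links to a few peers and pass messages across the overlay, independent of the transport underneath. A toy sketch in Python (names and topology hypothetical; a real mesh would encrypt links and avoid naive flooding):

```python
# Toy application-level overlay: nodes route messages over peer links,
# independent of the underlying transport (the "mesh above layer 7" idea).
class Node:
    def __init__(self, name):
        self.name = name
        self.peers = []   # encrypted links in a real mesh; plain references here
        self.inbox = []

    def link(self, other):
        """Create a bidirectional overlay link between two nodes."""
        self.peers.append(other)
        other.peers.append(self)

    def send(self, dest, payload, seen=None):
        """Flood the overlay until the destination is reached."""
        seen = seen or set()
        if self.name in seen:
            return False
        seen.add(self.name)
        if self.name == dest:
            self.inbox.append(payload)
            return True
        return any(p.send(dest, payload, seen) for p in self.peers)

# Build a small mesh: a-b-c in a line, then deliver across it.
a, b, c = Node("a"), Node("b"), Node("c")
a.link(b); b.link(c)
assert a.send("c", "hello")
assert c.inbox == ["hello"]
```

The point of the sketch is only that the overlay's routing is decided entirely by the applications; the layers below never see the mesh topology.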

  • pcfreak30 Member
    edited June 2023

    @Maounique said: I have advocated this for years; I hope to see it come true soon, as an industry standard supported by various big organizations. It would be the ultimate model of resilience, no longer limited to any location or DC, inherently secure and encrypted ALL OVER.

    I can at least speak somewhat authoritatively, since I work full time in the blockchain/P2P space. In some respects I am working directly on building decentralized internet tech.

    I can tell LET is largely disconnected, as if it's in a different universe... but the decentralized web/P2P is definitely leveling up, and blockchain was a catalyst.

    Kudos

  • Maounique Host Rep, Veteran

    @pcfreak30 said: I can tell LET is largely disconnected, as if it's in a different universe...

    Could you please explain that? I don't understand what you meant.

  • SirFoxy Member
    edited June 2023

    @Maounique said:

    @pcfreak30 said: I can tell LET is largely disconnected, as if it's in a different universe...

    Could you please explain that? I don't understand what you meant.

    not enough crypto bros here

  • I've looked at it, but it seems to be a solution in search of a problem. I still have yet to see a valid problem or rationale for this technology or solution. Not shit-talking, just trying to see why I need to look deeper into this.

  • TimboJones Member
    edited June 2023

    @Maounique said:

    @TimboJones said: I think you're both talking about different layers of the OSI model.

    Using the OSI analogy, yes, but it doesn't really have to be a layer 8, 9, etc., albeit that could work perfectly. Say layer 8 is the mesh over layer 7, decoupled from the actual communication layers: the applications talk to each other at layer 7, and on top of the applications a mesh (p2p or something similar) exists, fully encrypted and controlled by the apps, while another, virtual app layer communicates on top of said mesh. But there can be other models, for example everything integrated at layer 8: a distributed system like a virtual node, using the computing power of many physical computers, facilitated by the apps, completely transparent regarding the CPU make and model; the ultimate HAL.

    It might seem complex, but I am absolutely convinced it is fully doable and not that hard.

    Sure, but no new layers are needed. All that you talked about is handled at existing layers already.

    The AI part is kind of unclear to me atm, but it could be used through deep learning to manage bottlenecks and do load balancing, at least.

    Load balancing doesn't need AI. It doesn't need additional things to go wrong, or additional dependencies, when it's in the critical path. There are already existing load-balancing algorithms; it doesn't need new shit breaking something that should be robust already.
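As a concrete instance of those existing algorithms, least-connections dispatch is deterministic and needs no learning component. A minimal sketch in Python (backend names hypothetical):

```python
# Least-connections load balancing: always dispatch to the backend with
# the fewest active connections -- deterministic, no learning component.
class LeastConnections:
    def __init__(self, backends):
        self.active = {b: 0 for b in backends}  # active connection counts

    def acquire(self):
        """Pick the least-loaded backend and count the new connection."""
        backend = min(self.active, key=self.active.get)
        self.active[backend] += 1
        return backend

    def release(self, backend):
        """A connection finished; free a slot on its backend."""
        self.active[backend] -= 1

lb = LeastConnections(["node-a", "node-b"])
first = lb.acquire()
second = lb.acquire()
assert {first, second} == {"node-a", "node-b"}  # load spread across both
```

Ties break by insertion order here; production balancers (HAProxy, nginx) layer health checks and weights on top of the same basic rule.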

  • Maounique Host Rep, Veteran

    @TimboJones said: Load balancing doesn't need AI.

    Oh, but we are talking about a whole new level of load balancing: numbers of spares, self-healing, constantly watching availability while striking a balance so that it doesn't cost too many resources. With millions of circuits, there would be huge failures when the Indian government has some spasm and closes the internet (other places as well); the system must learn the stable areas and sort things out according to the hour of the day. Sure, algorithms could be devised to do that, but deep learning could do it much better.

    @TimboJones said: Sure, but no new layers are needed. All that you talked about is handled at existing layers already.

    See, this is why you think existing load balancing would do: you do not realize the complexity of all this.
    Ultimately, yes, we can handle everything at the application layer, but by that logic we could also "simplify" things and remove layer 7 altogether; after all, the internet is about the links, not the applications...
    I don't say it is absolutely needed, but it could make things clearer.

  • @SirFoxy said:

    @Maounique said:

    @pcfreak30 said: I can tell LET is largely disconnected, as if it's in a different universe...

    Could you please explain that? I don't understand what you meant.

    not enough crypto bros here

    Funny, though slightly true regarding the culture. I'm not a trader anyway (I'm a dev), but there's plenty of them :D

  • @HalfEatenPie said:
    I've looked at it, but it seems to be a solution in search of a problem. I still have yet to see a valid problem or rationale for this technology or solution. Not shit-talking, just trying to see why I need to look deeper into this.

    Can you please clarify what you are referring to exactly?

  • Maounique Host Rep, Veteran

    Leave him; there is no single real problem to solve with this. There are already tools for privacy, virtualization, redundancy, storage, load balancing, authentication, assuring a fair share, etc.

    We COULD use them as disparate things; heck, in some areas less complexity could be good, maybe even excellent. But the whole would be more than the sum of its parts.

    We used to have computers, phone lines, mainframes, military communications, and other types of communication; we put these together and made the internet.

    Now we need to step further in the integration game.

  • @pcfreak30 said:

    @HalfEatenPie said:
    I've looked at it, but it seems to be a solution in search of a problem. I still have yet to see a valid problem or rationale for this technology or solution. Not shit-talking, just trying to see why I need to look deeper into this.

    Can you please clarify what you are referring to exactly?

    Provide real world use cases.

  • Maounique Host Rep, Veteran
    edited June 2023

    @TimboJones said: Provide real world use cases.

    I will do that for you:
    1. I want to remain anonymous and have everything censorship-resistant;
    2. I want to participate in a resource-sharing system (which includes traffic, storage, computing, virtualisation) without bothering with payments;
    3. I want resilient storage;
    4. I want resilient routing over an encrypted layer;
    5. I want a VM "in the cloud" for which I don't have to pay anything, just share resources with the network;
    6. I want load balancing and equal accessibility from all over the world;
    7. I want my freedom back: my digital life anonymous in a cloud, everything encrypted, no snooping, no tracking, sharing only the info I want to share;
    8. I want everything in one place, one interface, a one-stop shop.

    This is possible: a citizen's internet, the ultimate in resilience, anonymity, resource sharing, and CDN, at the price of everything being slower and some 3:1 or so contribution.
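The resource-sharing model in point 2, together with the 3:1 contribution mentioned at the end, can be sketched as a simple credit ledger: contribute three units of compute to earn one spendable credit. A toy sketch in Python (the ratio and the unit of account are hypothetical):

```python
# Toy contribution ledger: earn credits at a 3:1 ratio (contribute three
# units of compute/storage/traffic to earn one unit of spendable credit).
CONTRIB_RATIO = 3

class Ledger:
    def __init__(self):
        self.credits = {}  # node name -> spendable credits

    def contribute(self, node, units):
        """Record contributed resources and mint credits at the ratio."""
        self.credits[node] = self.credits.get(node, 0) + units / CONTRIB_RATIO

    def spend(self, node, units):
        """Consume credits, e.g. to spin up a VM on the network."""
        if self.credits.get(node, 0) < units:
            raise ValueError("insufficient credits")
        self.credits[node] -= units

ledger = Ledger()
ledger.contribute("alice", 9)   # 9 units contributed -> 3 credits earned
ledger.spend("alice", 2)        # spend 2 credits on a workload
assert ledger.credits["alice"] == 1
```

A blockchain's role here would only be to make this ledger tamper-evident and shared; the accounting itself is this simple.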

  • HalfEatenPie Veteran
    edited June 2023

    @pcfreak30 said:

    @HalfEatenPie said:
    I've looked at it, but it seems to be a solution in search of a problem. I still have yet to see a valid problem or rationale for this technology or solution. Not shit-talking, just trying to see why I need to look deeper into this.

    Can you please clarify what you are referring to exactly?

    Sure. But how about in the form of a question.

    What is the actual problem/pain point you're addressing here? What's the reason for using this solution over current solutions available on the market?

    Another example question, what makes VirtEngine different from, let's say Cloud.net's historic federated cloud solution?

    Basically, answering the question "why does this matter?" (not trying to be a dick, but this isn't clear to me yet). Again, my take has been that this seems like a solution that was made without addressing a specific problem. It's more "hey, this technology is cool and interesting, let's stack a few methodologies together and see what happens" than "hey, I have this problem and this technology is the right fit for the job". So it falls under the category of "so what?" for me, and until I understand or hear why, that's the limit of this.

  • @Maounique said:

    @TimboJones said: Provide real world use cases.

    I will do that for you:
    1. I want to remain anonymous and have everything censorship-resistant;
    2. I want to participate in a resource-sharing system (which includes traffic, storage, computing, virtualisation) without bothering with payments;
    3. I want resilient storage;
    4. I want resilient routing over an encrypted layer;
    5. I want a VM "in the cloud" for which I don't have to pay anything, just share resources with the network;
    6. I want load balancing and equal accessibility from all over the world;
    7. I want my freedom back: my digital life anonymous in a cloud, everything encrypted, no snooping, no tracking, sharing only the info I want to share;
    8. I want everything in one place, one interface, a one-stop shop.

    This is possible: a citizen's internet, the ultimate in resilience, anonymity, resource sharing, and CDN, at the price of everything being slower and some 3:1 or so contribution.

    Double post on my end. Yeah, but is this the right technology stack to address this?

  • TimboJones Member
    edited June 2023

    @Maounique said:

    @TimboJones said: Provide real world use cases.

    I will do that for you:
    1. I want to remain anonymous and have everything censorship-resistant;
    2. I want to participate in a resource-sharing system (which includes traffic, storage, computing, virtualisation) without bothering with payments;
    3. I want resilient storage;
    4. I want resilient routing over an encrypted layer;
    5. I want a VM "in the cloud" for which I don't have to pay anything, just share resources with the network;
    6. I want load balancing and equal accessibility from all over the world;
    7. I want my freedom back: my digital life anonymous in a cloud, everything encrypted, no snooping, no tracking, sharing only the info I want to share;
    8. I want everything in one place, one interface, a one-stop shop.

    This is possible: a citizen's internet, the ultimate in resilience, anonymity, resource sharing, and CDN, at the price of everything being slower and some 3:1 or so contribution.

    You don't seem to be talking about this thread, just stuff you want. Please stick to use cases from the OP.

    The system - called VirtEngine, provides a secure and verifiable way to establish and verify the identity of individuals and entities within a blockchain network as well as protect and encrypt sensitive data.

    That's not to protect your anonymity. That's to allow business to happen. This will likely get more funding from unscrupulous actors if they say they can verify individuals...

    You also forget that someone has to make money for their efforts and technology; you're not thinking practically. Have you looked at previous resource-pooling projects that failed hard? There's too much overhead, and less stability and robustness. This isn't for open-source public use to build what you describe; this is using buzzwords to do something that's not needed. You might use AI as a client to spin up more machines before the need occurs, but that doesn't help the DC, which has to make large hardware purchases and carry excess capacity all the time.

    VirtEngine also powers a Distributed Computing network and its own Cloud Marketplace system, allowing providers to rent out computing power and consumers to consume computing power via Cloud Services, High Performance Compute, and other integrations.

    Given how much processing power we'll all have in the coming years, the move will actually be to keep as much of your own data as possible in your own home and do all the AI yourself. That shit will be common in a few years; it's already been on flagship phones for years.
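For what it's worth, "spin up more machines before the need occurs" doesn't require AI either: a moving average over recent demand, plus some headroom, already gives predictive pre-scaling. A minimal sketch in Python (window and headroom values are hypothetical):

```python
# Pre-scaling from a moving average of recent demand: provision capacity
# ahead of the trend -- no learning component required.
from collections import deque

class Prescaler:
    def __init__(self, window=3, headroom=1.5):
        self.samples = deque(maxlen=window)  # recent load observations
        self.headroom = headroom             # provision 50% above the average

    def observe(self, load):
        """Record one demand sample (e.g. requests/sec or vCPUs in use)."""
        self.samples.append(load)

    def target_capacity(self):
        """Capacity to provision now, ahead of demand."""
        avg = sum(self.samples) / len(self.samples)
        return int(avg * self.headroom)

scaler = Prescaler()
for load in (10, 20, 30):
    scaler.observe(load)
assert scaler.target_capacity() == 30   # average 20, times 1.5 headroom
```

This is essentially what conventional autoscalers do; it also illustrates TimboJones's point that it helps the client, not the DC that still has to own the idle hardware.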

  • Maounique Host Rep, Veteran

    @TimboJones said: just stuff you want

    True, but the building blocks are here, in the "manifesto".

  • @Maounique said:

    @TimboJones said: just stuff you want

    True, but the building blocks are here, in the "manifesto".

    Hence why I said...

    @HalfEatenPie said:
    I've looked at it, but it seems to be a solution in search of a problem.

  • jackb Member, Host Rep
    edited June 2023

    @Maounique said:

    @TimboJones said: just stuff you want

    True, but the building blocks are here, in the "manifesto".

    One consideration to keep in mind is whether there'll be sufficient interest in this sort of thing to actually make it viable, assuming OP actually manages to pull it off.

    I'd vote no to both until proven otherwise. It wouldn't be the first time I've wrongly assumed something will fail, but it would be far from the first time I've been right.

    Even with the best intentions, OP is a small outfit (a single person?), and the danger of announcing something this early is that it never comes to fruition, especially with small or single-person teams. You either get the good feeling of achievement from the announcement (but are daunted by how much work is still to come), or get disheartened by feedback, and give up either way.

  • @Maounique said:

    @TimboJones said: just stuff you want

    True, but the building blocks are here, in the "manifesto".

    Not really. You wouldn't use AI for any of that; you'd have a carefully designed protocol. Think RFC, not AI.

  • Maounique Host Rep, Veteran

    @TimboJones said: Think RFC, not AI.

    I think the complexity of it (especially in further iterations) would require deep learning.
    But we shall see.

  • @TimboJones said:

    @pcfreak30 said:

    @HalfEatenPie said:
    I've looked at it, but it seems to be a solution in search of a problem. I still have yet to see a valid problem or rationale for this technology or solution. Not shit-talking, just trying to see why I need to look deeper into this.

    Can you please clarify what you are referring to exactly?

    Provide real world use cases.

    I mean, are you referring to VirtEngine or something else, since we have gone a bit off topic... Want to ensure I respond about the right thing.

  • DETio Member

    @TimboJones said:

    @pcfreak30 said:

    @HalfEatenPie said:
    I've looked at it, but it seems to be a solution in search of a problem. I still have yet to see a valid problem or rationale for this technology or solution. Not shit-talking, just trying to see why I need to look deeper into this.

    Can you please clarify what you are referring to exactly?

    Provide real world use cases.

    Pulled from the VirtEngine Protocol paper, pages 3-4:

    Current limitations with background techniques used in these technologies:
    (1) Traditional Cloud Computing relies on centralized entities to build and manage their Cloud Computing services, resulting in a single point of failure and a risk of data loss/downtime.
    (2) Decentralized systems do not provide a method for identifying and verifying users for the KYC (Know Your Customer) regulations required by most government agencies when dealing with finance.
    (3) Decentralized Computing through Blockchain does not offer access to HPC; instead it relies on users to implement custom code, such as Smart Contracts, to access compute power offered by blockchains.
    (4) Distributed Computing is dependent on centralized entities to manage and deploy the network.
    (5) Supercomputers rely on centralized entities to deploy and manage infrastructure.
    (6) Distributed Computing networks are limited by the network interconnection between nodes, making them unsuitable for certain real-time use cases.
    (7) Supercomputers are limited by physical infrastructure constraints; however, they provide higher network interconnection between compute nodes compared to Distributed Computing.
    (8) There are no Distributed Computing systems commercially available to the general public. Folding@Home is an example of an existing Distributed Computing system; however, access is limited to specific scientists and research organizations within the protein-folding space.
    (9) There are no commercially available supercomputers that can be rented by just anyone; the only available systems, such as the Summit, Sierra and Frontera supercomputers, are reserved for scientists and researchers.
    (10) Supercomputers require tremendous CAPEX funds to implement and deploy.
    (11) Cloud Computing can provide a limited alternative to supercomputer capabilities, but is limited by the amount of hardware that can be deployed and dedicated to HPC tasks.
    (12) Blockchain Proof-of-Work networks such as Bitcoin are inherently extremely wasteful of computing capacity.
    (13) Cloud Computing services can be underutilized, leading to wasted computing capacity, because providers need additional capacity in place to deal with surges in demand.
    (14) Consumer compute devices such as mobiles and PCs are often underutilized, as devices are not always actively used to their full computing extent.

    VirtEngine resolves these limitations; thus the use cases are the pain points caused by the above limitations.
