Search - "pubsub"
I can retire! I automated myself!
I introduce to you, retoorii1b! Yes - I fit in a 1b LLM. Retoorii1b is a bit retoorded tho. It's quite realistic.
I tested several LLMs with the same training and it was amazing. Even a 0.5b one had the most interesting Dutch ever. Her Dutch is like my English, I suppose.
The 0.5b one could code fine. Retoorii1b still has some ethics to delete to make it more realistic.
I haven't decided on a base model yet, but it'll probably be the lightest one, so I can let a few of them chat with each other on my web platform / pubsub-server project. I have a few laptops to host them on. I can let it execute actions like file listings or background task execution.
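The action hookup will probably be something dumb like this: let the model answer normally, and have the host script watch for a marker in the reply and run the matching task. Rough sketch only; the marker format and names are completely made up:

    import os
    import subprocess

    # Hypothetical action dispatcher: scan the model's reply for a marker
    # and run the matching local task.
    ACTIONS = {
        "list_files": lambda arg: "\n".join(sorted(os.listdir(arg or "."))),
        "run_task": lambda arg: subprocess.run(arg, shell=True, capture_output=True, text=True).stdout,
    }

    def handle_reply(reply):
        # Expected marker format: "ACTION:<name> <argument>" somewhere in the reply.
        for line in reply.splitlines():
            if line.startswith("ACTION:"):
                name, _, arg = line[len("ACTION:"):].partition(" ")
                if name.strip() in ACTIONS:
                    return ACTIONS[name.strip()](arg.strip())
        return reply  # no action requested, just chat

    print(handle_reply("ACTION:list_files ."))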
See the comments for a very awkward response regarding my file listing. She described everything.
She just said these things. I'm kinda proud. I became a parent:
3. **Keep functions short and sweet**: Aim for functions under 50 lines long. Any longer and you're just wasting people's time.
Now if you'll excuse me, I have more important things to attend to... like coding my next game in Unreal Engine.
I'm writing a devRant-like site, a kind of forum that supports live chat under every article. Login will be just username and password to stay anonymous. Email is optional, for password reset. Also, it won't have password requirements; who cares if a user picks an insecure password. I do like the devRant avatar thing, but I will use the ducky generator instead, so everyone on the site is a custom duck. K-SASS prolly never expected his generator to be used anywhere. The requirement for this site is that it scales very well. I have db calls of 0.006s; this is for persistent data only and will be shared by all site instances. I expect it can handle many concurrent clients as long as I don't return more than 30 rows or so. Events get handled by a self-written pubsub server.
All sounds great and development goes fine. But why is this a rant? Because the same thing as always is biting me: I can't design a site at all. I know how, but I have no feeling for design whatsoever, which makes me almost incapable of building an attractive site. The only thing I can 'design' is an application in Bootstrap or smth. Ironically, I spend so much time on design while I don't even like doing it. But the looks of a site are almost as important as a well-working site; a site that works well but looks bad doesn't get used, in many cases. This has been an issue since the start of my career, and it sucks that I apparently can't deliver a whole site on my own that meets my standards.
My backend work is top notch tho. Btw, this application is not meant to be an alternative to devRant. I don't think I can attract more users than it already has, and I've seen two communities disappear once because someone decided to make a new one, took half the community with him, and both communities died after a short while.
The end product of this project is a working project, not a live site hosted somewhere. It's purely about mixing mostly self-written tech to get the best performance. Reinventing the wheel on many levels. I considered doing the site in C but decided it's way too much work for the value. I change the site so rapidly, since I don't have a decent plan, that Python aiohttp is the best trade-off between writing it yourself and speed of development. It's very lightweight.
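The live chat under every article is basically one aiohttp WebSocket handler that fans each message out to everyone watching that article. Rough sketch below, with an in-process dict standing in for my self-written pubsub server (names are placeholders):

    from collections import defaultdict
    from aiohttp import web, WSMsgType

    channels = defaultdict(set)  # article id -> connected websockets

    async def chat(request):
        article_id = request.match_info["article_id"]
        ws = web.WebSocketResponse()
        await ws.prepare(request)
        channels[article_id].add(ws)
        try:
            async for msg in ws:
                if msg.type == WSMsgType.TEXT:
                    # fan out to every client subscribed to this article
                    for peer in list(channels[article_id]):
                        if not peer.closed:
                            await peer.send_str(msg.data)
        finally:
            channels[article_id].discard(ws)
        return ws

    app = web.Application()
    app.add_routes([web.get("/chat/{article_id}", chat)])

    if __name__ == "__main__":
        web.run_app(app, port=8080)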
More a story than a rant, sorry.
Hi guys, does anyone know about Google App Engine custom runtimes and CloudSQL?
Please answer my question on StackOverflow.
http://stackoverflow.com/questions/...
Looks like no one there cares about helping me...
So, I turned on my old phone to transfer all my p̶o̶r̶n̶ photos, and it crashed from the notifications.
And I understand that it's an underpowered device and notifications aren't expected to come by the hundreds, but how terribly misdesigned must a pubsub system like Push be for the client to crash?
Just created a Cloudflare worker that sends messages to the backend over PubSub. Feeling pretty bad-ass right now.
It amazes me how much brain power can be wasted with React.
In those 7 hours (impressed by my own bullshit tolerance), it took me 20 min to understand a fucking API and how the objects relate to each other, 1 h to do the tut,
and 5+ fucking hours to understand how to plug the components together.
I've used Vue and Backbone before and have been a Node.js user for 5 years.
Seriously, React is a bag of shitty magic.
I don't even want to try to read the source code yet; that could be the fatal move...
Oh, and also: people have to stop with JSX, it is so so so wrong. New syntax with new errors, all for fucking syntactic sugar that saves a pair of parentheses!!!!
Like it matters after having installed 1e2+ MB of dependencies for an SPA of 10 components...
The only thing missing is a React IDE to support JSX. #wheregoesthefront
And I'm not even at the data flow and pubsub hells yet, which I'm sure will be gold as well.
I am trying to extract data from a Pub/Sub subscription and, once the data is extracted, do some transformation. Currently it's in bytes format. I have tried multiple ways to extract the data in JSON format using a custom schema, but it fails with an error:
TypeError: __main__.MySchema() argument after ** must be a mapping, not str [while running 'Map to MySchema']
**readPubSub.py**
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
import json
import typing

class MySchema(typing.NamedTuple):
    user_id: str
    event_ts: str
    create_ts: str
    event_id: str
    ifa: str
    ifv: str
    country: str
    chip_balance: str
    game: str
    user_group: str
    user_condition: str
    device_type: str
    device_model: str
    user_name: str
    fb_connect: bool
    is_active_event: bool
    event_payload: str

TOPIC_PATH = "projects/nectar-259905/topics/events"

def run(pubsub_topic):
    options = PipelineOptions(streaming=True)
    runner = 'DirectRunner'
    print("I reached before pipeline")
    with beam.Pipeline(runner, options=options) as pipeline:
        message = (
            pipeline
            | "Read from Pub/Sub topic" >> beam.io.ReadFromPubSub(subscription='projects/triple-nectar-259905/subscriptions/bq_subscribe')  # .with_output_types(bytes)
            | 'UTF-8 bytes to string' >> beam.Map(lambda msg: msg.decode('utf-8'))
            | 'Map to MySchema' >> beam.Map(lambda msg: MySchema(**msg)).with_output_types(MySchema)
            | "Writing to console" >> beam.Map(print))
    print("I reached after pipeline")
    result = message.run()
    result.wait_until_finish()

run(TOPIC_PATH)
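Based on the error, I suspect that after the UTF-8 decode each element is still a JSON string, so MySchema(**msg) receives a str instead of a dict. A rough sketch of what I think the parsing step should look like, with json.loads in between (not sure this is the right approach):

    with beam.Pipeline(runner, options=options) as pipeline:
        message = (
            pipeline
            | "Read from Pub/Sub topic" >> beam.io.ReadFromPubSub(subscription='projects/triple-nectar-259905/subscriptions/bq_subscribe')
            | 'UTF-8 bytes to string' >> beam.Map(lambda msg: msg.decode('utf-8'))
            | 'Parse JSON' >> beam.Map(json.loads)  # str -> dict
            | 'Map to MySchema' >> beam.Map(lambda d: MySchema(**d)).with_output_types(MySchema)
            | "Writing to console" >> beam.Map(print))
    # Inside the with-block the pipeline runs and waits on exit, so the separate
    # message.run() / wait_until_finish() calls should not be needed.
    # Also: the printed messages below don't contain create_ts, so that field
    # may need a default (create_ts: str = '') or MySchema(**d) could still fail.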
If instead I use the pipeline below, without the schema mapping,
message = (
    pipeline
    | "Read from Pub/Sub topic" >> beam.io.ReadFromPubSub(subscription='projects/triple-nectar-259905/subscriptions/bq_subscribe')  # .with_output_types(bytes)
    | 'UTF-8 bytes to string' >> beam.Map(lambda msg: msg.decode('utf-8'))
    | "Writing to console" >> beam.Map(print))
I get output as
{
'user_id': '102105290400258488',
'event_ts': '2021-05-29 20:42:52.283 UTC',
'event_id': 'Game_Request_Declined',
'ifa': '6090a6c7-4422-49b5-8757-ccfdbad',
'ifv': '3fc6eb8b4d0cf096c47e2252f41',
'country': 'US',
'chip_balance': '9140',
'game': 'gru',
'user_group': '[1, 36, 529702]',
'user_condition': '[1, 36]',
'device_type': 'phone',
'device_model': 'TCL 5007Z',
'user_name': 'Minnie',
'fb_connect': True,
'event_payload': '{"competition_type":"normal","game_started_from":"result_flow_rematch","variant":"target"}',
'is_active_event': True
}
{
'user_id': '102105290400258488',
'event_ts': '2021-05-29 20:54:38.297 UTC',
'event_id': 'Decline_Game_Request',
'ifa': '6090a6c7-4422-49b5-8757-ccfdbad',
'ifv': '3fc6eb8b4d0cf096c47e2252f41',
'country': 'US',
'chip_balance': '9905',
'game': 'gru',
'user_group': '[1, 36, 529702]',
'user_condition': '[1, 36]',
'device_type': 'phone',
'device_model': 'TCL 5007Z',
'user_name': 'Minnie',
'fb_connect': True,
'event_payload': '{"competition_type":"normal","game_started_from":"result_flow_rematch","variant":"target"}',
'is_active_event': True
}
Please let me know if I'm doing something wrong while parsing the data to JSON. Also, I'm looking for examples of doing data masking and running some SQL within Apache Beam.
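For the masking part, what I have in mind is roughly the sketch below (hashing the identifier fields with hashlib; just a guess at the approach, field names taken from the payload above). For the SQL part I've seen apache_beam.transforms.sql.SqlTransform, which as far as I understand works on schema-aware PCollections like the NamedTuple rows and needs Java available for its expansion service, but I haven't got it working yet:

    import hashlib
    from apache_beam.transforms.sql import SqlTransform

    def mask(row):
        # row is a MySchema NamedTuple (as defined in readPubSub.py above);
        # replace direct identifiers with a truncated one-way hash.
        digest = lambda v: hashlib.sha256(v.encode('utf-8')).hexdigest()[:16]
        return row._replace(user_id=digest(row.user_id),
                            user_name=digest(row.user_name),
                            ifa=digest(row.ifa),
                            ifv=digest(row.ifv))

    # ... added to the pipeline after 'Map to MySchema':
    #     | 'Mask PII' >> beam.Map(mask).with_output_types(MySchema)
    #     | 'Events per game' >> SqlTransform(
    #           "SELECT game, COUNT(*) AS events FROM PCOLLECTION GROUP BY game")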