2022-05-31

DefaultDict in twinBASIC crashes compiler

I'm trying to make a defaultdict in twinBASIC: a dictionary where, if a key does not exist, it is populated with a default value for its type. This is what I've tried:

Private Class DefaultDict(Of T)
    Implements Scripting.Dictionary Via dict
    Private dict As Dictionary
    Private defaultValue As T

    Public Property Get Item(ByVal key As Variant) As Variant Implements Scripting.Dictionary.Item
        If Not dict.Exists(key) Then
            dict(key) = defaultValue
        End If
        Return dict(key)
    End Property
End Class

Called like:

Dim dict As New DefaultDict(Of Long)
dict("foo") += 1 'foo key defaults to 0
Debug.Print dict("foo") 'should be 1

However, this just crashes the compiler. What's the proper approach here?
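A compiler crash is a bug worth reporting either way, but as a workaround it may be worth testing whether the `Implements Scripting.Dictionary Via dict` line combined with the generic class is the trigger. A hedged sketch that drops the interface delegation and exposes just the members needed (untested against the current twinBASIC build):

```vb
Private Class DefaultDict(Of T)
    Private dict As New Scripting.Dictionary
    Private defaultValue As T

    Public Property Get Item(ByVal key As Variant) As T
        ' Populate the default lazily, as in Python's defaultdict
        If Not dict.Exists(key) Then dict(key) = defaultValue
        Return dict(key)
    End Property

    Public Property Let Item(ByVal key As Variant, ByVal value As T)
        dict(key) = value
    End Property
End Class
```

If this still crashes, the generic parameter itself is likely the culprit and a non-generic `Variant`-based version would be the next thing to try.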



How to use enrich with foreach/iterate to modify SOAP request body?

I want to add multiple tags to a SOAP request body from a JSON input array.

They should be added under the ActionProperties tag, so I tried to use the enrich mediator with foreach as follows:

<payloadFactory media-type="xml">
            <format>
                <soapenv:Envelope xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/" 
                xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" 
                xmlns:xsd="http://www.w3.org/2001/XMLSchema" 
                xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
                    <soapenv:Header>
                        <Security xmlns="http://schemas.xmlsoap.org/ws/2002/12/secext">
                            <hd:UsernameToken xmlns:hd="http://schemas.xmlsoap.org/ws/2002/12/secext">
                                <hd:Username>$1</hd:Username>
                                <hd:Password>$2</hd:Password>
                            </hd:UsernameToken>
                        </Security>
                    </soapenv:Header>
                    <soapenv:Body>
                        <p857:ExecuteChangesRequest refresh="1" xmlns:p857="http://www.filenet.com/ns/fnce/2006/11/ws/schema">
                            <p857:ChangeRequest id="1" updateSequenceNumber="0">
                                <p857:TargetSpecification classId="ObjectStore" objectId="$3" serializationDuplicate="0"/>
                                <p857:Action autoUniqueContainmentName="1" classId="$4" defineSecurityParentage="0" xsi:type="p857:CreateAction"/>
                                <p857:Action autoClassify="0" checkinMinorVersion="0" xsi:type="p857:CheckinAction"/>
                                <p857:ActionProperties>

                                    <p857:Property propertyId="DocumentTitle" settable="0" xsi:type="p857:SingletonString">
                                        <p857:Value>$5</p857:Value>
                                    </p857:Property>
                                    <p857:Property propertyId="ContentElements" settable="0" xsi:type="p857:ListOfObject">
                                        <p857:Value accessAllowed="0" classId="ContentTransfer" dependentAction="Insert" newIndex="0" originalIndex="0" serializationDuplicate="0" updateSequenceNumber="0">
                                            <p857:Property propertyId="ContentType" settable="0" xsi:type="p857:SingletonString">
                                                <p857:Value>$7</p857:Value>
                                            </p857:Property>
                                            <p857:Property propertyId="RetrievalName" settable="0" xsi:type="p857:SingletonString">
                                                <p857:Value>$5</p857:Value>
                                            </p857:Property>
                                            <p857:Property propertyId="Content" settable="0" xsi:type="p857:ContentData">
                                                <p857:Value xsi:type="p857:InlineContent">
                                                    <p857:Binary>$6</p857:Binary>
                                                </p857:Value>
                                            </p857:Property>
                                        </p857:Value>
                                    </p857:Property>
                                </p857:ActionProperties>
                                <p857:RefreshFilter levelDependents="0" maxElements="0" maxRecursion="0">
                                    <p857:ExcludeProperties>DateCreated</p857:ExcludeProperties>
                                    <p857:ExcludeProperties>DateLastModified</p857:ExcludeProperties>
                                </p857:RefreshFilter>
                            </p857:ChangeRequest>
                        </p857:ExecuteChangesRequest>
                    </soapenv:Body>
                </soapenv:Envelope>
            </format>
            <args>
                <arg evaluator="xml" expression="get-property('Username')"/>
                <arg evaluator="xml" expression="get-property('Password')"/>
                <arg evaluator="xml" expression="get-property('ObjectStore')"/>
                <arg evaluator="xml" expression="get-property('DocumentClass')"/>
                <arg evaluator="xml" expression="get-property('FileName')"/>
                <arg evaluator="xml" expression="get-property('FileData')"/>
                <arg evaluator="xml" expression="get-property('ContentType')"/>
            </args>
        </payloadFactory>

<foreach expression="//propertiesList">
        <target>
        <sequence>
               <enrich>
                   <source clone="true" type="inline">                                             
                   <p857:Property propertyId="$1" settable="0" 
                    xmlns:p857="http://www.filenet.com/ns/fnce/2006/11/ws/schema" 
                    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="p857:SingletonString">
                        <p857:Value>$2</p857:Value>
                    </p857:Property>
                   <args>
                        <arg evaluator="xml" expression="//name"/>
                        <arg evaluator="xml" expression="//value"/>
                    </args>
                    </source>
                    <target action="child" xmlns:p857="http://www.filenet.com/ns/fnce/2006/11/ws/schema" 
                     xpath="//p857:ExecuteChangesRequest/p857:ChangeRequest/p857:ActionProperties"/>
                </enrich>                                                                                                                                           
        </sequence>
        </target>
    </foreach>

but nothing gets inserted into the SOAP request. However, if I enrich a single tag, it works fine:

<enrich>
        <source clone="true" type="inline">
            <p857:Property propertyId="isArchived" settable="0" 
            xmlns:p857="http://www.filenet.com/ns/fnce/2006/11/ws/schema" 
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="p857:SingletonString">
                <p857:Value>1</p857:Value>
            </p857:Property>
        </source>
        <target action="child" xmlns:p857="http://www.filenet.com/ns/fnce/2006/11/ws/schema" 
        xpath="//p857:ExecuteChangesRequest/p857:ChangeRequest/p857:ActionProperties"/>
    </enrich>

How can I accomplish that?
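For reference, a sketch of one direction to explore (hedged: this assumes WSO2 ESB/EI foreach semantics). The `$1`/`$2` placeholders and the `<args>` block are payloadFactory syntax; inside an enrich inline source they are not substituted, which would explain why nothing usable lands in the body. Moving the substitution into a per-iteration payloadFactory looks like this:

```xml
<foreach expression="//propertiesList">
    <sequence>
        <!-- payloadFactory performs the $1/$2 substitution; enrich's
             inline source does not support an <args> block -->
        <payloadFactory media-type="xml">
            <format>
                <p857:Property propertyId="$1" settable="0"
                    xmlns:p857="http://www.filenet.com/ns/fnce/2006/11/ws/schema"
                    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                    xsi:type="p857:SingletonString">
                    <p857:Value>$2</p857:Value>
                </p857:Property>
            </format>
            <args>
                <arg evaluator="xml" expression="//name"/>
                <arg evaluator="xml" expression="//value"/>
            </args>
        </payloadFactory>
    </sequence>
</foreach>
```

Inside the foreach, the current message is the matched sub-element, so the rewritten fragments replace each //propertiesList entry when the loop ends; they can then be moved under ActionProperties with an enrich like the single-tag one that already works. Treat this as a direction to explore rather than a drop-in answer.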



Laravel get attribute with hasMany

I'm making an online shop with Laravel. My cart table stores user_id and product_id records, with a hasMany relation to products. My problem is that I can't get, for example, a product's name or price; I can only get the whole array with the product and cart data. Can someone tell me how to get it? Maybe there is a problem with the query, or just the view syntax.

My migration:

    public function up()
    {
        Schema::create('carts', function (Blueprint $table) {
            $table->id();
            $table->unsignedBigInteger('user_id');
            $table->unsignedBigInteger('product_id');
            $table->timestamps();
            
            $table->foreign('user_id')->references('id')->on('users');
            $table->foreign('product_id')->references('id')->on('products');        
        });
    }

Here is my controller function:

  public function index(Request $request)
    {

        $query = Cart::with('product')->where('carts.user_id', $request->user()->id);
        $query = $query->get();

       return view('cart.index', ['cart' => $query]);
    }

And view to show cart

@extends('app')
@section('content')
    @foreach ($cart as $item)
    <form method="" action="">
        
        <button class="btn btn-outline-dark">X</button>
    </form> 
    @endforeach
@endsection

Model:

class Cart extends Model
{
    use HasFactory;

    public function product() {

        return $this->hasMany(Product::class, 'id', 'product_id');
    }
}

Is there another option besides $item['product'] to get only the product data?

I forgot to paste what the view returns:

[{"id":10,"name":"lklkl","description":"klklkk","img":"przyklad.mo.jpg","price":50,"count":9,"created_at":"2022-05-24T13:13:03.000000Z","updated_at":"2022-05-24T13:13:03.000000Z"}]

I would like to get, for example, the product's name.
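Since each carts row points at exactly one product, the relation that matches this schema is belongsTo rather than hasMany (hasMany hands back a collection, which is why the view only sees the whole array). A sketch of the model change:

```php
// Sketch: with one product per cart row, belongsTo returns a single
// Product model instead of a collection
class Cart extends Model
{
    use HasFactory;

    public function product()
    {
        return $this->belongsTo(Product::class, 'product_id');
    }
}
```

With that, the Blade view can read `{{ $item->product->name }}` directly; with the current hasMany you would need `$item->product->first()->name`.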



Filter (Triple) Nested Collection using Linq C#

I need to filter a List of Collections > Repos > Workflows and return the result in the same format.

Hopefully the example is fairly clear; please shout if you think it needs more detail.

    // All classes have a property 'Name'
    // Filter along the branch for any that match and return only the matching items

    List<Collection> AllCollections = new List<Collection>();
    Collection CollectionA = new Collection();
    Collection CollectionB = new Collection();
    CollectionA.Repos = new List<Repo>
    {
        new Repo { Name = "FirstRepo",  Workflows = new List<Workflow> { new Workflow { Name = "CI-CD" }, new Workflow { Name = "Tests" },   new Workflow { Name = "First-Ops" } } },
        new Repo { Name = "SecondRepo", Workflows = new List<Workflow> { new Workflow { Name = "CI-CD" }, new Workflow { Name = "Testing" }, new Workflow { Name = "Second-Ops" } } },
        new Repo { Name = "ThirdRepo",  Workflows = new List<Workflow> { new Workflow { Name = "CI-CD" }, new Workflow { Name = "Testers" }, new Workflow { Name = "Third-Ops" } } }
    };
    CollectionB.Repos = new List<Repo>
    {
        new Repo { Name = "FronEndUI",   Workflows = new List<Workflow> { new Workflow { Name = "CD" },     new Workflow { Name = "UI-Tests" }, new Workflow { Name = "first-Op" } } },
        new Repo { Name = "API",         Workflows = new List<Workflow> { new Workflow { Name = "CI" },     new Workflow { Name = "Testing" },  new Workflow { Name = "second-Op" } } },
        new Repo { Name = "VisualBasic", Workflows = new List<Workflow> { new Workflow { Name = "Deploy" }, new Workflow { Name = "Copy" },     new Workflow { Name = "third-Op" } } }
    };
    AllCollections.Add(CollectionA);
    AllCollections.Add(CollectionB);
    // Filter 
    string FilterString = "";  // string FilterString = "Copy" should return AllCollections > CollectionB > Repo VisualBasic > Workflow Copy
    // Result should be List of collections > List of Repos > List of workflows
    List<Collection> result = AllCollections.SelectMany(c => c.Repos.Where(r => r.Name.Contains(FilterString.ToLower())).ToList().SelectMany(w => w.Workflows.Where(w => w.Name.Contains(FilterString)))).ToList();
    // Don't know how to show a nested list in a table, sorry
    // This is returning the workflows but not the parent Repo, or the Repo's parent, the Collection



public class Collection
{
    public string Name { get; set; }

    public List<Repo> Repos { get; set; }
}

public class Repo
{
    public string Name { get; set; }

    public List<Workflow> Workflows { get; set; }
}

public class Workflow
{
    public string Name { get; set; }
}

See it here dotnetfiddle
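One shape that keeps the full parent chain is to project new Collection and Repo instances whose child lists are filtered, then drop any parents left empty. A sketch (the case-insensitive match via StringComparison is my assumption, mirroring the ToLower in the attempt; that Contains overload needs .NET Core 2.1+):

```csharp
List<Collection> result = AllCollections
    .Select(c => new Collection
    {
        Name = c.Name,
        Repos = c.Repos
            .Select(r => new Repo
            {
                Name = r.Name,
                Workflows = r.Workflows
                    .Where(w => w.Name.Contains(FilterString,
                        StringComparison.OrdinalIgnoreCase))
                    .ToList(),
            })
            // Keep only repos that still contain a matching workflow
            .Where(r => r.Workflows.Count > 0)
            .ToList(),
    })
    // Keep only collections that still contain a matching repo
    .Where(c => c.Repos.Count > 0)
    .ToList();
```

With an empty FilterString, Contains("") matches everything, so the unfiltered structure comes back unchanged, which matches the default in the example.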



how to fill missing date with certain time frequency?

I have a dataframe whose columns are datetimes at 30-minute, or sometimes 10-minute, intervals. Data and dates are missing for a few days. I'd like to fill the missing entries with zero or NaN at the same time frequency. How can I do this?

I found the similar question below, but the difference in my case is the time frequency: I'd like to keep the existing 30-minute spacing rather than use period_range for daily filling.

Sometimes my data has a 10-minute frequency; basically, I'd like to fill the missing dates at whatever frequency the data already has.

pandas fill missing dates in time series

I am pasting some data here for your reference.

Depth   5/19/2022 18:51 5/19/2022 19:21 5/19/2022 19:51 5/19/2022 20:21 5/19/2022 20:51 5/19/2022 21:21 5/19/2022 21:51 5/25/2022 0:22  5/25/2022 0:52  5/25/2022 1:22  5/25/2022 1:52  5/25/2022 2:22  5/25/2022 2:52
600 200.6   200.6   200.5   200.7   201.2   201 200.7   171.7   171.7   171.4   171 170.7   170.7
601 200.6   200.7   200.6   200.8   201.3   201.1   200.8   171.7   171.9   171.5   171.2   170.7   170.9
602 200.6   200.6   200.6   200.9   201.3   201 200.8   171.6   172.1   171.5   171.3   170.7   171
603 200.7   200.5   200.7   200.9   201.2   200.9   200.8   171.7   172.2   171.6   171.3   170.9   171.1
604 200.7   200.6   200.8   200.9   201.2   200.9   200.8   172 172.3   171.8   171.5   171.1   171.2
605 200.8   200.7   200.8   201 201.1   200.9   200.7   172.3   172.4   172 171.6   171.3   171.4
606 200.9   200.9   201 201.1   201 201 200.8   172.5   172.6   172.2   171.8   171.6   171.6
607 200.9   201 201.1   201.1   201 201.1   200.9   172.7   172.7   172.3   172.1   171.8   171.7
608 200.8   200.9   201.1   201 200.9   200.9   200.9   172.8   172.8   172.3   172.2   171.9   171.8
609 200.8   200.8   201.1   201 200.9   200.8   201 173 172.9   172.4   172.2   172.1   171.8
610 200.7   200.7   201.1   200.9   201 200.8   200.9   173.1   173 172.6   172.2   172.2   172
611 200.6   200.7   200.9   200.9   201.1   201 200.9   173.2   173.1   172.8   172.3   172.3   172.1
612 200.7   200.8   200.9   200.9   201.3   201.2   201 173.3   173.3   173.1   172.5   172.4   172.3
613 200.8   200.9   201 201.1   201.5   201.3   201 173.5   173.3   173.2   172.8   172.6   172.5
614 201.1   201 201.2   201.3   201.7   201.4   201.1   173.7   173.4   173.3   172.9   172.8   172.7

Thanks
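A hedged sketch of one way to do this, using a toy frame shaped like the sample (depth rows, timestamp columns): take the smallest gap between the existing timestamps as the frequency, build the full regular grid with date_range, and reindex the columns onto it.

```python
import pandas as pd

# Toy wide frame like the sample: depth rows, timestamp columns,
# with the 19:51 column missing from an otherwise 30-minute grid
cols = pd.to_datetime(["2022-05-19 18:51", "2022-05-19 19:21", "2022-05-19 20:21"])
df = pd.DataFrame([[200.6, 200.6, 200.7]], index=[600], columns=cols)

# Infer the spacing from the smallest gap between existing timestamps
step = df.columns.to_series().diff().min()   # Timedelta of 30 minutes here

# Build the full regular grid and reindex; missing stamps become NaN
full = pd.date_range(df.columns.min(), df.columns.max(), freq=step)
df_filled = df.reindex(columns=full)         # append .fillna(0) for zeros
```

Using the minimum gap means a frame sampled every 10 minutes gets a 10-minute grid automatically; for data with the times on the index instead of the columns, the same idea works with `df.reindex(full)` or `df.asfreq(step)`.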




Error: Uncaught (in promise): true in jasmine test case for Angular

I'm new to Angular unit testing and was trying to fix an existing unit test case that fails. The error I'm getting is Error: Uncaught (in promise): true.

The code for the unit test case is as follows

  it('ngOnInit', fakeAsync(() => {
        // fixture.detectChanges();
        component.ngOnInit();
        tick();
        expect(component.dataConfig.industries.length).toEqual(3);
        expect(component.dataConfig.fonts.length).toEqual(2);
      }));

Within ngOnInit, a call is made to the following function:

checkUserRole=()=>{
    const requiredRoles = ['admin','developer'];
     this.hasAdminRole = this.service.checkRequiredRoles(
      requiredRoles,
      false
    );
  }

Within the spec.ts file, the stub is created and provided to the providers as follows:

TestBed.configureTestingModule({
      declarations: [ClientConfigComponent],
      imports: [TestingModule, ngfModule, MatDialogModule,
        MaterialModule,
        FormsModule,
        ToastrModule.forRoot(),
        BrowserAnimationsModule,
        ReactiveFormsModule,
        SharedModule
      ],
      providers: [{
        provide: MatDialogRef,
        useValue: mockDialogRef
      }, {
        provide: MAT_DIALOG_DATA, useValue: {}
      },
      { provide: HttpClient, useClass: FakeHttpService },
      {
        provide: MasterService,
        useValue: masterService
      }
      ]
    })

Stub

const masterService = {
    loadTranslationConfiguration(): any {
      return of({});
    },
    checkRequiredRoles: (requiredRoles, arg) => ({}),
    saveTag(): any {
        return of(saveTags);
    },
    
  };

Error in detail

Error: Uncaught (in promise): true
        error properties: Object({ rejection: true, promise: [object Promise], zone: Zone({ _parent: Zone({ _parent: null, _name: '<root>', _properties: Object({  }), _zoneDelegate: _ZoneDelegate({ _taskCounts: Object({ microTask: 0, macroTask: 0, eventTask: 0 }), zone: <circular reference: Object>, _parentDelegate: null, _forkZS: null, _forkDlgt: null, _forkCurrZone: null, _interceptZS: null, _interceptDlgt: null, _interceptCurrZone: null, _invokeZS: null, _invokeDlgt: null, _invokeCurrZone: null, _handleErrorZS: null, _handleErrorDlgt: null, _handleErrorCurrZone: null, _scheduleTaskZS: null, _scheduleTaskDlgt: null, _scheduleTaskCurrZone: null, _invokeTaskZS: null, _invokeTaskDlgt: null, _invokeTaskCurrZone: null, _cancelTaskZS: null, _cancelTaskDlgt: null, _cancelTaskCurrZone: null

I tried returning true from the checkRequiredRoles stub using the observable's of operator, but it gave the same issue. I also tried having the service's checkRequiredRoles return a promise and using .then in the stub, but checkRequiredRoles().then came back undefined.

I'm fairly sure the error comes from the checkRequiredRoles function returning a boolean value and not a promise:

public checkIfCurrentUserHasRequiredRoles(
    role: string[],
    RequiredRole: boolean
  ): boolean {
    
    //Some Logic

    return isTrue;
  }

What should I do to fix this issue, and where am I going wrong? Thanks in advance for the help.
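One thing that stands out: the real service method returns a plain boolean synchronously, while the stub's checkRequiredRoles returns an object literal ({}), and the attempted fixes returned an observable or a promise, both of which change the type the component receives. A hedged sketch keeping the stub on the synchronous boolean path:

```typescript
// Sketch: return the plain boolean the component expects, not an
// object, observable, or promise
const masterService = {
  loadTranslationConfiguration: (): any => of({}),
  checkRequiredRoles: (requiredRoles: string[], arg: boolean): boolean => true,
  saveTag: (): any => of(saveTags),
};
```

If the component logic branches on hasAdminRole, a truthy `{}` versus an explicit true/false can send ngOnInit down a different path than the test expects, which is consistent with the unhandled `true` rejection in the error.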



2022-05-30

powershell cannot change directory if path contains [ whereas dos command can [duplicate]

I don't understand why, in PowerShell, I cannot go into a folder whose name contains [], for example

cd c:\test\[demo] 

whereas I created it in PowerShell with

md [demo]

and I can actually cd into it with the DOS command.

So what can I do if I want to navigate into this folder from PowerShell?
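For the record, square brackets are wildcard metacharacters in PowerShell's path handling (Set-Location's -Path parameter accepts wildcards), which is why the literal path fails. Two options that should work:

```powershell
# Option 1: -LiteralPath turns off wildcard interpretation entirely
Set-Location -LiteralPath 'C:\test\[demo]'

# Option 2: escape the brackets with backticks in the wildcard path
Set-Location 'C:\test\`[demo`]'
```

The same -LiteralPath switch exists on most filesystem cmdlets (Get-ChildItem, Copy-Item, Remove-Item, ...), so it is the safer habit whenever paths may contain [ or ].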



Query sqlalchemy and count distinct results

I want to write a query to count how many Items a User has, grouped by Product.title.

This function gives exactly what I want. I feel like there is a way I can use func.count to make this into 1 line.

    def cart(self):
        items = {}
        for product in Product.query.join(Item).filter(Item.user==self):
            item=Item.query.filter(Item.user==self, Item.product==product)
            items[product.title] = {'id': product.id,'count': item.count(), 'cost': product.cost}
        return items

This is my desired return. I've tried joining Product and Item, but I just get the distinct Product rows with no ability to count. Any suggestions?

    Product.title: count
    Apples: 10
    Bananas: 5
    Hotdogs: 1

Tables:

class Item(db.Model):
    __tablename__ = 'item'

    id = db.Column(db.Integer, primary_key=True)

    product_id = db.Column(db.Integer, db.ForeignKey('product.id'), nullable=False)
    product = db.relationship("Product")

    user_id = db.Column(db.Integer, db.ForeignKey('user.id'))
    user = db.relationship("User")

class Product(db.Model):
    __tablename__ = 'product'

    id = db.Column(db.Integer, primary_key=True)
    title = db.Column(db.String, nullable=False)

    items = db.relationship('Item', back_populates='product', lazy='dynamic')

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)

    items = db.relationship('Item', back_populates='user', lazy='dynamic')


Is there any way to export pandas dataframe into database with existing tables?

Dear Stack Overflow community, I'm trying to export my dataframe into a PostgreSQL database. I used SQLAlchemy, but it doesn't give me the opportunity to map the dataframe onto the existing tables in the database. For example, this is my dataframe:

ClientNumber ClientName Amout
        1000     Albert   5000     
        2000       John   4000
        1200   Cristian   1000

and the database has this table:

id_client  client_name   client_amount
     1000       Albert            5000     
     2000         John            4000
     1200     Cristian            1000

The question is how to link my dataframe to PostgreSQL without being forced to change the dataframe's column names. Thanks in advance.
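One low-friction route is to rename the columns only at insert time, so the dataframe itself keeps its own names. A sketch (SQLite stands in for the real postgresql:// URL, and the table name `clients` is my assumption):

```python
import pandas as pd
from sqlalchemy import create_engine

df = pd.DataFrame({
    "ClientNumber": [1000, 2000, 1200],
    "ClientName": ["Albert", "John", "Cristian"],
    "Amout": [5000, 4000, 1000],
})

# Dataframe-name -> database-column-name mapping, applied only for the insert
mapping = {"ClientNumber": "id_client",
           "ClientName": "client_name",
           "Amout": "client_amount"}

# Swap in e.g. "postgresql+psycopg2://user:pass@host/dbname" for the real DB
engine = create_engine("sqlite://")
df.rename(columns=mapping).to_sql(
    "clients", engine, if_exists="append", index=False
)

out = pd.read_sql("SELECT * FROM clients", engine)
```

`if_exists="append"` inserts into the existing table instead of replacing it, and `index=False` keeps the dataframe index out of the insert.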



svelte can't display image from fastify

I want to fetch an image from Fastify and display it in Svelte. When I visit the server URL (http://localhost:3000/image/Darkaron.jpg) it works, but when I visit it from the Svelte URL (http://localhost:8080/image/Darkaron.jpg) it refuses to show and gives me this error:

https://i.postimg.cc/HsKwYkpK/image-error.png

server code:

import fastify from "fastify"
import fstatic from "@fastify/static"
import FastifyMultipart from "@fastify/multipart"
import cors from "@fastify/cors";

const imagePath = '/storage'

type FileResponse = {
    filename: string
}

const server = fastify({
    logger: true
})
server.register(fstatic, {
    root: imagePath
})
server.register(FastifyMultipart)
server.register(cors, {
    origin: "*",
    methods: ["OPTIONS", "GET", "POST"]
})

server.get("/image/:filename", (req, res) => {
    res.sendFile((req.params as FileResponse).filename)
})

const start = async () => {
    try {
        await server.listen(3000)
    } catch (err) {
        server.log.error(err)
        process.exit(1)
    }
}
start()

Svelte code:

<script lang="ts">
import axios from "axios";
import { onMount } from "svelte";

onMount(async function() {
    let res = await axios({
        method: 'GET',
        url: 'http://localhost:3000/image/Darkaron.jpg'
    })
    console.log(res.data)
})
</script>

<div>check console</div>
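One hedged observation: if the goal is simply to display the image, the browser can load it directly through an `<img>` tag pointed at the Fastify route; fetching it with axios returns the binary body as a mangled string unless `responseType` is set to `'blob'`. A minimal sketch:

```svelte
<script lang="ts">
  // Let the browser load the image from the Fastify route directly
  const src = "http://localhost:3000/image/Darkaron.jpg";
</script>

<img {src} alt="Darkaron" />
```

Note the Svelte dev server on :8080 does not serve /image/* itself, so http://localhost:8080/image/Darkaron.jpg will 404 unless a proxy to :3000 is configured; pointing at :3000 (with the CORS setup already in place) avoids that.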


Click (Meta) Command to run a list of commands

I need to be able to trigger multiple click commands from one command on the CLI

Let's say I have a click group

@click.group()
def cli():
    pass

@cli.command()
def a():
    print("A")

@cli.command()
def b():
    print ("B")

What functionality should I add to run an ordered list of the commands like the following?

$ python -m my_cli_module a,b
A
B

The goal is that there are shared variables which get initialized for each of my commands; the init is expensive and I'd like to run it exactly once.
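Click's built-in mechanism for this is a chained group: the invocation syntax becomes space-separated (`a b` rather than `a,b`), the group callback runs exactly once per invocation, and the expensive init can live there and be shared through ctx.obj. A sketch:

```python
import click


@click.group(chain=True)
@click.pass_context
def cli(ctx):
    # Expensive one-time init, shared with every chained command
    ctx.obj = {"db": "connected"}


@cli.command()
@click.pass_context
def a(ctx):
    assert ctx.obj["db"] == "connected"  # shared init is visible here
    click.echo("A")


@cli.command()
@click.pass_context
def b(ctx):
    click.echo("B")


if __name__ == "__main__":
    cli()
```

Then `python -m my_cli_module a b` prints A and B in order. If the comma syntax is a hard requirement, a wrapper command can split its argument on commas and `ctx.invoke` each named subcommand in order.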



I'm trying to make an axios GET request in my React component; I get the object in the console.log, but when I try to render it I get "is not defined"

//component 
const Clientslist = () => {
  const classes = useStyles()

  axios.get('/api/clients').then(resp => {
    const {clients} = resp.data
    console.log(clients) // i get the data on the terminal
    
  })

    return(
        ...
       {
          clients.map(client => ( //clients is not defined
              <Grid key={client._id} item xs={12} sm={6} md={4}>
                  <Card 
                    clientName={client.clientName}
                    ...
          )
       }

//controller 
   const get = async (req, res) => {
     await dbConnect()
     const clients = await ClientsModel.find()
     res.status(200).json({ success: true, clients})
   }

I think my request code is poor; if someone could help me fix the problem, and even suggest a refactor for better, cleaner code, it would be great. Thanks.
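For reference, a hedged sketch of the usual shape for this (useStyles, Grid, and Card are assumed from the original component): the fetch moves into a run-once useEffect, and the response lands in state, so `clients` is defined in the JSX scope instead of dying inside the .then callback:

```jsx
// Sketch: fetch once on mount, keep the result in state; calling axios
// in the component body re-fires on every render, and `clients` never
// reaches the JSX scope otherwise
import { useEffect, useState } from "react";
import axios from "axios";

const Clientslist = () => {
  const classes = useStyles();
  const [clients, setClients] = useState([]);

  useEffect(() => {
    axios.get("/api/clients").then((resp) => setClients(resp.data.clients));
  }, []); // empty deps: run once on mount

  return (
    <>
      {clients.map((client) => (
        <Grid key={client._id} item xs={12} sm={6} md={4}>
          <Card clientName={client.clientName} />
        </Grid>
      ))}
    </>
  );
};
```

The empty dependency array also avoids the subtler bug in the original: an axios call in the component body fires a new request on every render.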



2022-05-29

Is it possible to do USB-debugging on Android when Phone's USB port is already used?

Problem: I'd like to run an app on my Phone from my laptop, via USB, using Android Studio, while my phone is physically connected to another USB device (a remote controller for a drone).

In this case, I have:

  • My Laptop (MacBook running Android Studio Bumblebee)
  • My Phone (Samsung Galaxy with Android 11)
  • A controller for a DJI drone (which plugs directly into the Phone)

The problem is that the phone obviously has only one USB port, so it can't connect to both the laptop and the controller.

Question: Is it possible (with a USB hub, or maybe by connecting both the phone and the controller to the computer?) to do USB debugging on the phone from the laptop while the phone communicates via USB with the controller?

Note: I have successfully connected laptop-to-phone with Wi-Fi debugging before, but the connection can be a bit slow and laggy, so it would be nice if USB were possible. Additionally, I am not able to do this while my phone is working as the internet hotspot for the laptop (so I need to bring in yet another hotspot device).



OpenCV - How to create webM with a transparent background?

When I use COLOR_RGBA2BGR everything works fine, but the transparent background of the GIF becomes black.

Example gif url: https://i.stack.imgur.com/WYOQB.gif

When I use COLOR_RGBA2BGRA instead, OpenCV generates an invalid video.

How can I write a WebM with a transparent background?

Conversion works at https://www.aconvert.com/video/gif-to-webm/, so it's possible somehow.

import cv2
import imageio as imageio

fourcc = cv2.VideoWriter_fourcc(*'vp09')
output = cv2.VideoWriter("video.webm", fourcc, 30.0, (512, 500))

frames_ = imageio.mimread("crazy.gif")

# frames = [cv2.cvtColor(frame, cv2.COLOR_RGBA2BGRA) for frame in frames_]
frames = [cv2.cvtColor(frame, cv2.COLOR_RGBA2BGR) for frame in frames_]

for frame in frames:
    frame = cv2.resize(frame, (512, 500))
    output.write(frame)
output.release()
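As far as I know, cv2.VideoWriter has no alpha-channel support, which is why the 4-channel BGRA frames produce an invalid file and the BGR path flattens transparency to black. One workaround is to encode outside OpenCV with ffmpeg's VP9 encoder and an alpha-capable pixel format (assumes ffmpeg is installed and on PATH):

```shell
# VP9 with yuva420p keeps the GIF's transparency in the output WebM
ffmpeg -i crazy.gif -c:v libvpx-vp9 -pix_fmt yuva420p video.webm
```

If the frames must first pass through OpenCV processing, a common pattern is to write them as a PNG sequence (PNG preserves alpha) and then run the same ffmpeg command over the sequence.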


Cannot set service data inside subscribe in Angular

I want to set shared service data inside a subscribe method. My page structure is shown in the attached screenshot.

I have to access data set in one component (app.component) from the home component and the header component.

 this.sharedService.setData({title: this.title, logo: this.logo});

app.component.ts

  setData(): void {
    this.http.get(this.baseUrl+'api/content').subscribe(result  => {
      this.title=result['response'].title;
      this.logo=result['response'].logo;
      this.sharedService.setData({title: this.title, logo: this.logo});
    }); 
    
  }

But in this case, when I access the service data in any other component I get blank data for title and logo, whereas when I pass static data (not set inside the subscribe method's API call) its value does get passed to the other components.

Service:

import { Injectable } from '@angular/core';
import { Observable } from 'rxjs';
import { HttpClient } from '@angular/common/http';
import { environment } from '../../environments/environment';
import { BehaviorSubject } from 'rxjs';

export interface SharedData {
  title: string;
  logo: string;

}

@Injectable({
  providedIn: 'root'
})

export class SharedService  {

  private sharedData$ = new BehaviorSubject<SharedData>({title: '', logo: ''});
  sharedData = this.sharedData$.asObservable();

  constructor() { }

  setData(data: SharedData): void {
    this.sharedData$.next(data);
  }
}

Any solution? Thanks.
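For reference, a hedged sketch of a consuming component: since sharedData$ is a BehaviorSubject, a component that subscribes to the stream receives the value pushed after the HTTP call resolves, whereas a one-off read at construction time only sees the initial blank object. (This assumes a single service instance, i.e. SharedService is provided in root only and not re-listed in any component's `providers` array, which would create a second, empty instance.)

```typescript
// Sketch: subscribe to the stream instead of reading the value once
export class HeaderComponent implements OnInit {
  title = '';
  logo = '';

  constructor(private sharedService: SharedService) {}

  ngOnInit(): void {
    this.sharedService.sharedData.subscribe((data) => {
      this.title = data.title;
      this.logo = data.logo;
    });
  }
}
```

In a template, the `async` pipe (`{{ (sharedService.sharedData | async)?.title }}`) achieves the same thing and handles unsubscription automatically.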



is this a result of firestore latency or normal behavior

I have a form I am using to allow users to add comments to my site. The form has an input field, a textarea field, and a button. When the button is clicked it runs my addComment() function which adds the name, comment, and timestamp to my firestore collection as a new doc.

It seems like after I click the button to add a comment, I have to wait a few seconds before I can post another one. If I try to add a new comment too quickly, the request doesn't get sent to my Firestore collection, but if I wait a few seconds everything works as expected.

I am curious if this is normal behavior? How can I set it up so users can always post comments without having to wait a few seconds? Can someone explain to me what is happening?

Thanks.

Update: I have been doing some debugging and noticed that both functions from the first useEffect, getVisitorCount() and getUserComments(), run every time I type something into the name or comment input boxes. I have attached screenshots to show what is happening.

On the first initial load of the app: (screenshot)

After typing something in the name input box: (screenshot)

Finally, after typing something into the comment box as well: (screenshot)

This is not the desired behavior; these two functions should not run while I am typing in either text field. The getUserComments function should only run on the initial render of the app and whenever the add-comment button is clicked. Could this be what is causing the problems I am experiencing?
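That is very plausibly the cause. `collection(db, ...)` returns a new reference object on every render, and the first useEffect lists those refs (plus numberOfVisitors, which the effect itself updates) as dependencies, so every keystroke's re-render re-runs both fetches. A hedged sketch of the same effect with the refs created inside it and an empty dependency array (setnumberOfVistors, setUserComments, db, getDocs, and collection as in the component below):

```jsx
// Sketch: create the refs inside the effect so the dependency array can
// be empty, making the fetch run exactly once on mount
useEffect(() => {
  const portfolioStatsRef = collection(db, "portfolio-stats");
  const userCommentsRef = collection(db, "user-comments");

  const load = async () => {
    const stats = await getDocs(portfolioStatsRef);
    setnumberOfVistors(stats.docs.map((d) => ({ ...d.data(), id: d.id })));

    const comments = await getDocs(userCommentsRef);
    setUserComments(comments.docs.map((d) => ({ ...d.data(), id: d.id })));
  };

  load();
}, []); // run once on mount; re-fetch explicitly after addComment if needed
```

The "wait a few seconds" symptom is likely the same loop competing with the write, rather than Firestore latency itself.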

import React, { useState, useEffect } from "react";
import { NavBar, Footer, Home, About } from "./imports";
import { BrowserRouter as Router, Route, Routes } from "react-router-dom";
import { db } from "./firebase-config";
import {
  collection,
  getDocs,
  doc,
  updateDoc,
  addDoc,
  Timestamp,
} from "firebase/firestore";

export default function App() {
  const [formData, setFormdata] = useState([]);
  const [numberOfVisitors, setnumberOfVistors] = useState([]);
  const [userComments, setUserComments] = useState([]);

  const portfolioStatsRef = collection(db, "portfolio-stats");
  const userCommentsRef = collection(db, "user-comments");

  const currentNumberOfVisitors = numberOfVisitors.map((visitors) => {
    return (
      <h2 className="p-3 mb-0 bg-dark bg-gradient text-white" key={visitors.id}>
        Number of vistors: {visitors.visitor_count}
      </h2>
    );
  });

  const listOfUserComments = userComments.map((comment) => {
    return (
      <li className="list-group-item" key={comment.id}>
        <div className="d-flex w-100 justify-content-center">
          <h5 className="mb-1">{comment.name}</h5>
          <small>{comment.date.toDate().toString()}</small>
        </div>
        <p className="d-flex justify-content-center mb-1">{comment.comment}</p>
      </li>
    );
  });

  useEffect(() => {
    const getVisitorCount = async () => {
      const dataFromPortfolioStatsCollection = await getDocs(portfolioStatsRef);

      setnumberOfVistors(
        dataFromPortfolioStatsCollection.docs.map((doc) => {
          return { ...doc.data(), id: doc.id };
        })
      );
    };

    const getUserComments = async () => {
      const dataFromUserCommentsCollection = await getDocs(userCommentsRef);

      setUserComments(
        dataFromUserCommentsCollection.docs.map((doc) => {
          return { ...doc.data(), id: doc.id };
        })
      );
    };
    getVisitorCount();
    getUserComments();
  }, [numberOfVisitors, portfolioStatsRef, userCommentsRef]);

  useEffect(() => {
    const updateVisitorCount = async () => {
      const portfolioStatsDoc = doc(
        db,
        "portfolio-stats",
        numberOfVisitors[0].id
      );
      const updatedFields = {
        visitor_count: numberOfVisitors[0].visitor_count + 1,
      };
      await updateDoc(portfolioStatsDoc, updatedFields);
    };

    if (!numberOfVisitors.length) return;

    let sessionKey = sessionStorage.getItem("sessionKey");

    if (sessionKey === null) {
      sessionStorage.setItem("sessionKey", "randomString");
      updateVisitorCount();
    }
  }, [numberOfVisitors]);

  const handleFormData = (event) => {
    setFormdata((prevFormData) => {
      return {
        ...prevFormData,
        [event.target.name]: event.target.value,
      };
    });
  };

  const addComment = async () => {
    const newComment = {
      name: formData.name,
      comment: formData.comment,
      date: Timestamp.now(),
    };
    await addDoc(userCommentsRef, newComment);
  };

  return (
    <>
      <div className="d-flex flex-column overflow-hidden min-vh-100 vh-100">
        <NavBar />
        <div className="row">
          <div className="col text-center">
            {numberOfVisitors.length === 0 && (
              <h2 className="p-3 mb-0 bg-dark bg-gradient text-danger">
                Sorry, the Firestore free tier quota has been met for today.
                Please come back tomorrow to see portfilio stats.
              </h2>
            )}
            {currentNumberOfVisitors}
          </div>
        </div>
        <div className="bg-image">
          <div className="postion-relative">
            <main className="flex-grow-1">
              <div className="container-fluid p-0">
                <Router>
                  <Routes>
                    <Route path="/" element={<Home />} />
                    <Route path="/about" element={<About />} />
                  </Routes>
                  <div className="row">
                    <div className="center-items col">
                      <h4 className="">Comments</h4>
                    </div>
                  </div>
                  <div className="row">
                    <div className="center-items col">
                      <div className="comments-container">
                        {userComments.length === 0 && (
                          <h4 className="text-danger bg-dark m-1 p-1">
                            Sorry, the Firestore free tier quota has been met
                            for today. Please come back tomorrow to see
                            portfolio comments.
                          </h4>
                        )}
                        {listOfUserComments}
                      </div>
                    </div>
                  </div>
                  <div className="row">
                    <div className="center-items col">
                      <h4 className="text-dark">Leave a comment</h4>
                    </div>
                  </div>
                  <div className="row">
                    <div className="center-items col">
                      <form className="comment-form">
                        <div className="form-floating mb-3">
                          <input
                            type="text"
                            className="bg-transparent form-control"
                            id="floatingInput"
                            name="name"
                            onChange={handleFormData}
                          />
                          <label htmlFor="floatingInput">Name</label>
                        </div>
                        <div className="form-floating">
                          <textarea
                            className="form-textarea-field bg-transparent form-control mb-1"
                            name="comment"
                            id="floatingTextarea"
                            onChange={handleFormData}
                          />
                          <label htmlFor="floatingTextarea">Comment</label>
                        </div>
                        <div className="d-grid">
                          <button
                            type="button"
                            className="btn btn-primary mb-4"
                            onClick={() => addComment()}
                          >
                            Add Comment
                          </button>
                        </div>
                      </form>
                    </div>
                  </div>
                </Router>
              </div>
            </main>
          </div>
        </div>
        <Footer />
      </div>
    </>
  );
}


Changing x axis scale on the fly

I am using the following gnuplot script to plot data from a set of files that are being added to constantly, once every five minutes:

set terminal x11 size 1900, 900
# The plot will not jump to the current window on update.
set term x11 1 noraise
set obj 1 rectangle behind from screen 0,0 to screen 1,1
set obj 1 fillstyle solid 1.0 fillcolor rgbcolor "black"
set grid lc rgb "white"

set key left top
set key textcolor rgb "white"

set border lc rgb "white"

set xtics textcolor rgb "white"
set xtics font "Times,12"
set xtics 1
set xlabel "Hours" textcolor rgb "white"

set ytics textcolor rgb "white"
set ytics font "Times,12"
set ytics 5
set ylabel "Hits" textcolor rgb "white"

set yrange [0:50]
set y2tics font "Times,10"
set y2tics 1                  # Figures on the right side of the plot as well
set y2range [0:50]

plot "/tmp/Stats/One" using ($1)/(12.):2 title "One" with lines, "/tmp/Stats/Two" using ($1)/(12.):2 title "Two" with lines, "/tmp/Stats/Three" using ($1)/(12.):2 title "Three" with lines
pause 300
reread

The tics in the x axis correspond to hourly intervals. This works fine, until the script has been running for a day or so - at which point the tics in the x axis start to look a bit cluttered.

Would it be possible to change this dynamically from within the gnuplot script itself? The idea is that if the script has been running for more than, say, half a day, the tics on the x axis should appear once every two hours rather than once every hour, with other similar changes later on, e.g. after one week there should be one tic per day. The x-axis label would have to change accordingly.

Is gnuplot capable of this, or are we talking about a shell script-driven approach instead? The latter is obviously possible, but it would be more cumbersome.
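gnuplot can do this on its own: since reread re-executes the script every cycle, a stats pass over one of the files can drive the tic increment. A sketch, assuming gnuplot >= 4.6 (for stats and the ternary operator) and the same $1/12 hours scaling used in the plot command above:

```gnuplot
# Measure the span of the data (in hours) and pick a tic increment from it.
stats "/tmp/Stats/One" using ($1/12.) nooutput
span = STATS_max - STATS_min
tic  = (span > 168) ? 24 : ((span > 12) ? 2 : 1)
set xtics tic
```

Placed just before the plot line, this re-evaluates on every reread, so the axis adapts as the files grow; the xlabel string can be switched the same way with a string-valued ternary expression.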



2022-05-28

Azure Container Apps Restarts every 30 seconds

I have an Azure Container App that's based on the hosted BackgroundService model. It's essentially just a long-running console app that overrides the BackgroundService.ExecuteAsync method and waits for the stop signal (via the passed cancellation token). When I run locally in Docker, it's perfect: everything runs as expected. When I deploy it as an Azure Container App, it deploys and runs (although I had to manually set the scale minimum to 1 to get it to run at all), but it restarts every 30 seconds or so, which is obviously not ideal. My guess is that the Azure Container Apps host is checking my instance for health, isn't satisfied, and tries to restart it? Just a guess. What am I missing?

using FR911.DataAccess.Repository;
using FR911.Infrastructure.Commands;
using FR911.Utils;
using FR911.Utils.Extensions;
using SimpleInjector;

IHost host = Host.CreateDefaultBuilder(args)
    .ConfigureServices(services =>
    {
        services.AddFR911Log4NetConfig();        
        services.AddTransient<ICommandProcessor, CommandProcessor>();
        Container container = new Container();
        container.Register(typeof(ICommandHandler<,>), new List<Type>()
            {
                //typeof(CacheSyncCommandHandler),
            });

#if DEBUG
        container.Verify();
#endif

        services.AddSingleton<Container>(container);
        services.AddHostedService<Worker>();
    })
    .Build();

await host.RunAsync();

public class Worker : BackgroundService
{
    private readonly ILogger<Worker> _logger;
    private ICommandProcessor _commandProcessor;

    public Worker(ILogger<Worker> logger, ICommandProcessor cmdProcessor)
    {
        _logger = logger;
        _commandProcessor = cmdProcessor;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        _logger.LogInformation("Worker starting at: {time}", DateTimeOffset.Now);

        DateTime? lastGC = null;
        while (!stoppingToken.IsCancellationRequested)
        {
            _logger.LogInformation("Worker running at: {time}", DateTimeOffset.Now);
            await Task.Delay(1000, stoppingToken);
        }
        _logger.LogInformation("Worker stopping at: {time}", DateTimeOffset.Now);
    }
}

2022-05-24 12:10:46,248 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker starting at: 05/24/2022 12:10:46 +00:00
2022-05-24 12:10:46,249 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:46 +00:00
2022-05-24 12:10:46,251 Microsoft.Hosting.Lifetime fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Application started. Press Ctrl+C to shut down.
2022-05-24 12:10:46,252 Microsoft.Hosting.Lifetime fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Hosting environment: Production
2022-05-24 12:10:46,336 Microsoft.Hosting.Lifetime fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Content root path: /app
2022-05-24 12:10:47,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:47 +00:00
2022-05-24 12:10:48,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:48 +00:00
2022-05-24 12:10:49,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:49 +00:00
2022-05-24 12:10:50,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:50 +00:00
2022-05-24 12:10:51,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:51 +00:00
2022-05-24 12:10:52,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:52 +00:00
2022-05-24 12:10:53,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:53 +00:00
2022-05-24 12:10:54,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:54 +00:00
2022-05-24 12:10:55,636 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:55 +00:00
2022-05-24 12:10:56,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:56 +00:00
2022-05-24 12:10:57,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:57 +00:00
2022-05-24 12:10:58,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:58 +00:00
2022-05-24 12:10:59,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:10:59 +00:00
2022-05-24 12:11:00,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:00 +00:00
2022-05-24 12:11:01,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:01 +00:00
2022-05-24 12:11:02,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:02 +00:00
2022-05-24 12:11:03,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:03 +00:00
2022-05-24 12:11:04,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:04 +00:00
2022-05-24 12:11:05,636 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:05 +00:00
2022-05-24 12:11:06,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:06 +00:00
2022-05-24 12:11:07,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:07 +00:00
2022-05-24 12:11:08,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:08 +00:00
2022-05-24 12:11:09,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:09 +00:00
2022-05-24 12:11:10,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:10 +00:00
2022-05-24 12:11:11,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:11 +00:00
2022-05-24 12:11:12,637 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:12 +00:00
2022-05-24 12:11:13,638 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:13 +00:00
2022-05-24 12:11:14,636 FR911.Worker.Worker fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Worker running at: 05/24/2022 12:11:14 +00:00
2022-05-24 12:11:14,930 Microsoft.Hosting.Lifetime fr911worker-app-20--vki2kmn-cf5bff474-5w6mh INFO Application is shutting down...


How to enforce the shape of a JSON/JSONB in Postgres?

I am trying to store the responses to survey questions as JSON, since they could be boolean (Is this Bar? Yes/No), number (How much is Foo?), or string (Describe what Foo is). It is working fine, but how can I enforce that, for a certain question, the JSON always has the same shape?

For example, for the question "How many Foo or Bar do you eat every day?", I am expecting the following structure (say the column is named answer):

{
 foo: number,
 bar: number
}

How can I enforce that and keep my data consistent?
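One way to pin the shape down is a CHECK constraint built from jsonb_typeof and the ?& key-existence operator. A sketch, where the table name survey_answers and the jsonb column name answer are assumptions:

```sql
ALTER TABLE survey_answers
  ADD CONSTRAINT answer_foo_bar_shape CHECK (
    jsonb_typeof(answer) = 'object'
    AND answer ?& ARRAY['foo', 'bar']            -- both keys must exist
    AND jsonb_typeof(answer -> 'foo') = 'number'
    AND jsonb_typeof(answer -> 'bar') = 'number'
  );
```

The ?& test matters: jsonb_typeof on a missing key yields SQL NULL, and a CHECK that evaluates to NULL is treated as passing, so the type checks alone would not reject a missing key. For shapes too complex to express in a constraint, a validation trigger (or a JSON Schema extension) is the usual fallback.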



Need help implementing ZILN custom loss function in lightGBM

I'm trying to implement this zero-inflated log normal loss function based on this paper in lightGBM (https://arxiv.org/pdf/1912.07753.pdf) (page 5). But, admittedly, I just don’t know how. I don’t understand how to get the gradient and hessian of this function in order to implement it in LGBM and I’ve never needed to implement a custom loss function in the past.

The authors of this paper have open sourced their code, and the function is available in TensorFlow (https://github.com/google/lifetime_value/blob/master/lifetime_value/zero_inflated_lognormal.py), but I’m unable to translate this to fit the parameters required for a custom loss function in LightGBM. As an example of how LightGBM accepts custom loss functions, a log-likelihood loss would be written as:

def loglikelihood(preds, train_data):
    labels = train_data.get_label()
    preds = 1. / (1. + np.exp(-preds))
    grad = preds - labels
    hess = preds * (1. - preds)
    return grad, hess

Similarly, I would need to define a custom eval metric to accompany it, such as:

def binary_error(preds, train_data):
    labels = train_data.get_label()
    preds = 1. / (1. + np.exp(-preds))
    return 'error', np.mean(labels != (preds > 0.5)), False

Both of the above two examples are taken from the following repository:

https://github.com/microsoft/LightGBM/blob/e83042f20633d7f74dda0d18624721447a610c8b/examples/python-guide/advanced_example.py#L136

Would appreciate any help on this, and especially detailed guidance to help me learn how to do this on my own.
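For what it's worth, the zero-inflation component of the ZILN loss is ordinary sigmoid cross-entropy against the indicator label > 0, so at least that component's gradient and hessian can be written in exactly the shape LightGBM expects. A sketch of just that piece (the lognormal mean and scale terms from the paper are not covered here):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def zero_inflation_grad_hess(preds, labels):
    """Gradient and hessian of the classification (zero-inflation) term of
    the ZILN loss: sigmoid cross-entropy against the indicator label > 0.
    preds are raw scores, as LightGBM passes them to a custom objective."""
    p = sigmoid(preds)
    pos = (labels > 0).astype(float)
    grad = p - pos            # d(loss)/d(pred)
    hess = p * (1.0 - p)      # d2(loss)/d(pred)2
    return grad, hess
```

Wrapped as `def ziln_classification(preds, train_data): return zero_inflation_grad_hess(preds, train_data.get_label())`, this plugs in exactly like the loglikelihood example above.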



Using Yarn v3.1.1 and GitHub Actions CI with node.js.yml?

I use Yarn v3.1.1 package manager and recently setup a Node.js CI pipeline with GitHub Actions.

How do I get the CI test to use the dependencies generated by yarn install instead of looking for package-lock.json generated by npm install?

Below is my node.js.yml file. I've tried changing the npm references to yarn and deleting the package-lock.json file, but the runner still looks for the package-lock.json file.

My end goal is to use one package manager, preferably Yarn.

EDIT: For clarification, my current workflow requires me to run npm install before pushing to origin so that the CI runs correctly, and yarn install before I can serve my repo locally.

name: Node.js CI

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:

    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [16.x]
        # See supported Node.js release schedule at https://nodejs.org/en/about/releases/

    steps:
    - uses: actions/checkout@v3
    - name: Use Node.js ${{ matrix.node-version }}
      uses: actions/setup-node@v3
      with:
        node-version: ${{ matrix.node-version }}
        cache: 'npm'
    - run: npm ci
    - run: yarn run build
    - run: yarn run test:headless
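A sketch of the same steps with Yarn throughout: actions/setup-node supports cache: 'yarn' (it keys the cache off yarn.lock rather than package-lock.json), and yarn install --immutable is the Yarn Berry equivalent of a lockfile-enforcing npm ci. This assumes yarn.lock is committed and package-lock.json is deleted:

```yaml
    steps:
    - uses: actions/checkout@v3
    - name: Use Node.js ${{ matrix.node-version }}
      uses: actions/setup-node@v3
      with:
        node-version: ${{ matrix.node-version }}
        cache: 'yarn'
    - run: yarn install --immutable
    - run: yarn run build
    - run: yarn run test:headless
```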


Placing 3 pictures under big picture [closed]

The plan was to have 3 pictures directly under the hero picture. I think it is possible, but I do not know how to do it.

Thank you for your help and time!
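A minimal sketch of one way to do this with flexbox; the file names and class names here are made up:

```html
<!-- A full-width hero image with three images in a row directly below it. -->
<img class="hero" src="hero.jpg" alt="Hero">
<div class="row">
  <img src="one.jpg" alt="">
  <img src="two.jpg" alt="">
  <img src="three.jpg" alt="">
</div>

<style>
  .hero { display: block; width: 100%; }
  .row { display: flex; gap: 8px; }
  .row img { flex: 1; min-width: 0; object-fit: cover; }
</style>
```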



Why Solana RPC write Transaction failed to sanitize accounts offsets correctly?

I tried to replicate my token swap transaction, and I get an error:

solana.rpc.core.RPCException: {'code': -32602, 'message': 'invalid transaction: Transaction failed to sanitize accounts offsets correctly'}

Successful Transaction - https://solscan.io/tx/5yL5MnDyLmDGu3xeR4gz8y3pJWGg5qXTzxMgd9qUPgYsPQ5ZUPkJt1DVHkmRAoqZUenHtDqgb4vsfkC1P2vjmKW9

Code:

from solana.transaction import AccountMeta, Transaction, TransactionInstruction
from solana.rpc.types import TxOpts
from solana.account import Account
from solana.rpc.api import Client
from solana.publickey import PublicKey
from solana.rpc.commitment import Recent, Root
from solana.keypair import Keypair
import base58
from spl.token.instructions import transfer, TransferParams

url = 'https://api.mainnet-beta.solana.com'
client = Client(url)

txn = Transaction(recent_blockhash=client.get_recent_blockhash()['result']['value']['blockhash'], fee_payer='8noB4Wv7rpUsz17eRD833G7VHnnmfp1JdstyXzuDvfoY')


byte_array = base58.b58decode('myprivatekey')
keypair = list(byte_array)  # byte values as ints

account = Keypair(keypair[0:32])

txn.add(
    TransactionInstruction(
        keys=[
            AccountMeta(pubkey=PublicKey('6vdR9SgRuBtdiR6rgpXLcdjQgV7Ja7GtjqjL5EP3uQwK'), is_signer=False, is_writable=True),
            AccountMeta(pubkey=PublicKey('8noB4Wv7rpUsz17eRD833G7VHnnmfp1JdstyXzuDvfoY'), is_signer=True, is_writable=True),
            AccountMeta(pubkey=PublicKey('Dv3UEMWoX4zwviYDG1ZpdBNiNjmUbkoAmohZLPar3oyA'), is_signer=False, is_writable=True),
            AccountMeta(pubkey=PublicKey('2vfgEPJStq761qrkyh8xedrj9zpew1GQ8CobjtQ4wtyM'), is_signer=False, is_writable=False),
            AccountMeta(pubkey=PublicKey('2x3yujqB7LCMdCxV7fiZxPZStNy7RTYqWLSvnqtqjHR6'), is_signer=False, is_writable=True),
            AccountMeta(pubkey=PublicKey('2x3yujqB7LCMdCxV7fiZxPZStNy7RTYqWLSvnqtqjHR6'), is_signer=False, is_writable=True),
            AccountMeta(pubkey=PublicKey('6UzTmTpFt8pEbdAnstJrusWU1eWggPrNRsyjr9XaPVTX'), is_signer=False, is_writable=True),
            AccountMeta(pubkey=PublicKey('JD8dd73w3JigdsS7rKCqpVzFhzRXZcK2evZzxFURd21x'), is_signer=False, is_writable=True),
            AccountMeta(pubkey=PublicKey('FoXyMu5xwXre7zEoSvzViRk3nGawHUp9kUh97y2NDhcq'), is_signer=False, is_writable=True),
            AccountMeta(pubkey=PublicKey('2GUvz8bAtoeartcAPYifiBWjSVobjEF7jp7uC2cELMCx'), is_signer=False, is_writable=True),
            AccountMeta(pubkey=PublicKey('ATokenGPvbdGVxr1b2hvZbsiqW5xWH25efTNsLJA8knL'), is_signer=False, is_writable=False),
            AccountMeta(pubkey=PublicKey('TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA'), is_signer=False, is_writable=False),
            AccountMeta(pubkey=PublicKey('11111111111111111111111111111111'), is_signer=False, is_writable=False),
            AccountMeta(pubkey=PublicKey('SysvarRent111111111111111111111111111111111'), is_signer=False, is_writable=False),
            
        ],
        program_id=PublicKey('8BYmYs3zsBhftNELJdiKsCN2WyCBbrTwXd6WG4AFPr6n'),
        data=bytes.fromhex('5052c1c9d81b46b801000000000000001027000000000000')
    )
)


inner_instruction_transfer = txn.add(transfer(TransferParams(
    amount=1,
    dest='JD8dd73w3JigdsS7rKCqpVzFhzRXZcK2evZzxFURd21x',
    owner='6vdR9SgRuBtdiR6rgpXLcdjQgV7Ja7GtjqjL5EP3uQwK',
    program_id='TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA',
    source='6UzTmTpFt8pEbdAnstJrusWU1eWggPrNRsyjr9XaPVTX',
)))


from solana.system_program import TransferParams, transfer
inner_instruction_sol_transfer = txn.add(transfer(TransferParams(from_pubkey='8noB4Wv7rpUsz17eRD833G7VHnnmfp1JdstyXzuDvfoY', to_pubkey='Dv3UEMWoX4zwviYDG1ZpdBNiNjmUbkoAmohZLPar3oyA', lamports=9800)))

inner_instruction_sol_transfer2 = txn.add(transfer(TransferParams(from_pubkey='8noB4Wv7rpUsz17eRD833G7VHnnmfp1JdstyXzuDvfoY', to_pubkey='2x3yujqB7LCMdCxV7fiZxPZStNy7RTYqWLSvnqtqjHR6', lamports=200)))



txn.sign(account)
rpc_response = client.send_transaction(
    txn,
    account,
    opts=TxOpts(skip_preflight=True, skip_confirmation=False)
)

print(rpc_response)


2022-05-27

Oracle - drill down the records

I have a table with services, and each combination of the services has a specific cost amount. I want to filter on one service, find out what the services one level to the left of it are, and then choose another service from that subset, and so on.

Please see the example picture: on the left is the process of the "drill down" and on the right is the desired output. Please ignore the sums of the amounts (they are not correct).

example picture

CREATE TABLE test_table (
id              INTEGER,
costcenter      VARCHAR2(20),
service_level1  VARCHAR2(40),
service_level2  VARCHAR2(40),
service_level3  VARCHAR2(40),
service_level4  VARCHAR2(40),
amount          INTEGER);

INSERT INTO test_table (id,costcenter, service_level1, service_level2, service_level3, service_level4, amount)
VALUES ( 1, '10016831', 'U00 COGNOS AL', NULL, NUll, NULL, 50000); 
INSERT INTO test_table (id,costcenter, service_level1, service_level2, service_level3, service_level4, amount)
VALUES ( 2, '10016832', 'EXADATA Basis', 'U00 COGNOS AL', NUll, NULL, 20000); 
INSERT INTO test_table (id,costcenter, service_level1, service_level2, service_level3, service_level4, amount)
VALUES ( 3, '10016833', 'SPLUNK','EXADATA Basis', 'U00 COGNOS AL', NULL, 15000); 
INSERT INTO test_table (id,costcenter, service_level1, service_level2, service_level3, service_level4, amount)
VALUES ( 4, '10016833', 'Linux Basis', 'SPLUNK', 'EXADATA Basis', 'U00 COGNOS AL', 30000); 
INSERT INTO test_table (id,costcenter, service_level1, service_level2, service_level3, service_level4, amount)
VALUES ( 5, '10016833', 'Linux Basis', 'Oracle Admin', 'EXADATA Basis', 'U00 COGNOS AL', 20000); 
COMMIT;
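A sketch of a single drill-down step against the sample data above, taking 'U00 COGNOS AL' as the service being drilled into: the rows that carry it in service_level2 expose its direct sub-services one column to the left, in service_level1:

```sql
SELECT service_level1 AS sub_service,
       SUM(amount)    AS total_amount
FROM   test_table
WHERE  service_level2 = 'U00 COGNOS AL'
GROUP  BY service_level1;
```

Repeating the query with the chosen sub-service shifted one column (service_level3 = 'U00 COGNOS AL' AND service_level2 = 'EXADATA Basis', and so on) walks further down; each step is the previous filter plus one more column.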


React Testing Library userEvent.type recognizing only the first letter

I'm using "@testing-library/user-event": "^14.2.0" with Next.js 12.

Here is my test

it('test search input', async () => {
    const searchInput = screen.getByPlaceholderText('Search assets');
    expect(searchInput).toBeInTheDocument();

    await userEvent.type(searchInput, 'test{Enter}');
    expect(searchInput).toHaveValue('test');
});

The test fails with the error below:

expect(element).toHaveValue(test)

    Expected the element to have value:
      test
    Received:
      t

      195 |
      196 |     await userEvent.type(searchInput, 'test{Enter}');
    > 197 |     expect(searchInput).toHaveValue('test');
          |                         ^
      198 |   });
      199 | });
      200 |

UPDATE: Here is my component code. The component is very simple: an input box with an onKeyDown event.

const SearchBox = () => {
      const router = useRouter();
      const handleKeyDown = (event: React.KeyboardEvent<HTMLInputElement>) => {
        const element = event.currentTarget as HTMLInputElement;
        const searchText = element.value;
        if (event.key === 'Enter') {
          router.push({
            pathname: '/search',
            query: { q: searchText },
          });
        }
      };
      return (
        <>
               <input
                className={styles.searchInput}
                type="text"
                placeholder="Search assets"
                onKeyDown={handleKeyDown}
              />
        </>
      );
    };
    
    export default SearchBox;

Can someone please help?
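One common cause with user-event v14 is Jest fake timers: user-event waits a short delay between keystrokes, and if the test file has jest.useFakeTimers() active that delay never elapses, so only the first character is typed. A sketch (not a drop-in test file) using v14's setup() session API, which can advance the fake clock itself:

```jsx
import userEvent from '@testing-library/user-event';

it('test search input', async () => {
  // advanceTimers is only needed when jest.useFakeTimers() is active:
  const user = userEvent.setup({ advanceTimers: jest.advanceTimersByTime });

  const searchInput = screen.getByPlaceholderText('Search assets');
  await user.type(searchInput, 'test{Enter}');
  expect(searchInput).toHaveValue('test');
});
```

Even without fake timers, calling userEvent.setup() once per test (instead of the direct userEvent.type API) is the v14-recommended pattern and worth trying first.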



Spring Boot and Logstash TCP link doesn't work with Docker Compose "localhost/

It works for me locally, but I get an error when using Docker:

The error is: localhost/<unresolved>:5000: connection failed. How can I set this unresolved value for the Logstash destination?

docker-compose

 version: '3.2'
    services:
      elasticsearch:
        image: elasticsearch:$ELK_VERSION
        volumes:
          - elasticsearch:/usr/share/elasticsearch/data
        environment:
          ES_JAVA_OPTS: "-Xmx256m -Xms256m"
          # Note: currently there doesn't seem to be a way to change the default user for Elasticsearch
          ELASTIC_PASSWORD: $ELASTIC_PASSWORD
          # Use single node discovery in order to disable production mode and avoid bootstrap checks
          # see https://www.elastic.co/guide/en/elasticsearch/reference/current/bootstrap-checks.html
          discovery.type: single-node
          # X-Pack security needs to be enabled for Elasticsearch to actually authenticate requests
          xpack.security.enabled: "true"
        ports:
          - "9200:9200"
          - "9300:9300"
        healthcheck:
          test: "wget -q -O - http://$ELASTIC_USER:$ELASTIC_PASSWORD@localhost:9200/_cat/health"
          interval: 1s
          timeout: 30s
          retries: 300
        networks:
          - internal
        restart: unless-stopped
      # https://www.elastic.co/guide/en/logstash/current/docker-config.html
      logstash:
        image: logstash:$ELK_VERSION
        ports:
          - "5000:5000"
          - "9600:9600"
        environment:
          LS_JAVA_OPTS: "-Xmx256m -Xms256m"
          ELASTIC_USER: $ELASTIC_USER
          ELASTIC_PASSWORD: $ELASTIC_PASSWORD
          XPACK_MONITORING_ELASTICSEARCH_USERNAME: $ELASTIC_USER
          XPACK_MONITORING_ELASTICSEARCH_PASSWORD: $ELASTIC_PASSWORD
          XPACK_MONITORING_ELASTICSEARCH_HOSTS: "elasticsearch:9200"
          XPACK_MONITORING_ENABLED: "true"
        volumes:
          - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
        networks:
          - internal
        restart: unless-stopped
        depends_on:
          - elasticsearch
    
      # https://www.elastic.co/guide/en/kibana/current/docker.html
      kibana:
        image: kibana:${ELK_VERSION}
        environment:
          ELASTICSEARCH_USERNAME: $ELASTIC_USER
          ELASTICSEARCH_PASSWORD: $ELASTIC_PASSWORD
          # Because Elasticsearch is running in a containerized environment
          # (setting this to false will result in CPU stats not being correct in the Monitoring UI):
          XPACK_MONITORING_UI_CONTAINER_ELASTICSEARCH_ENABLED: "true"
        ports:
          - "5601:5601"
        networks:
          - internal
        restart: unless-stopped
        depends_on:
          - elasticsearch
          - logstash
      mysqldb:
        image: mysql:5.7
        restart: unless-stopped
        env_file: ./.env
        environment:
          - MYSQL_ROOT_PASSWORD=$MYSQLDB_ROOT_PASSWORD
          - MYSQL_DATABASE=$MYSQLDB_DATABASE
        ports:
          - $MYSQLDB_LOCAL_PORT:$MYSQLDB_DOCKER_PORT
        volumes:
          - db:/var/lib/mysql
      app:
        depends_on:
          - mysqldb
        build: ./../
        restart: on-failure
        env_file: ./.env
        ports:
          - $SPRING_LOCAL_PORT:$SPRING_DOCKER_PORT
        environment:
          SPRING_APPLICATION_JSON: '{
            "spring.datasource.url"  : "jdbc:mysql://mysqldb:$MYSQLDB_DOCKER_PORT/$MYSQLDB_DATABASE?useSSL=false",
            "spring.datasource.username" : "$MYSQLDB_USER",
            "spring.datasource.password" : "$MYSQLDB_ROOT_PASSWORD",
            "spring.jpa.properties.hibernate.dialect" : "org.hibernate.dialect.MySQL5InnoDBDialect",
            "spring.jpa.hibernate.ddl-auto" : "update",
            "spring.application.name" : "ebnelhaythem"
          }'
        volumes:
          - .m2:/root/.m2
    
    networks:
      internal:
    
    volumes:
      elasticsearch:
      db:

And the logs are:

elastic_log_docker-app-1            | 14:15:45,959 |-WARN in net.logstash.logback.appender.LogstashTcpSocketAppender[logstash] - Log destination localhost/<unresolved>:5000: connection failed. java.net.ConnectException: Connection refused
elastic_log_docker-app-1            | java.net.ConnectException: Connection refused
elastic_log_docker-app-1            |     at java.base/sun.nio.ch.Net.pollConnect(Native Method)
elastic_log_docker-app-1            |     at java.base/sun.nio.ch.Net.pollConnectNow(Net.java:672)

callback is not a function - castv2

I'm following this http://siglerdev.us/blog/2021/02/26/google-home-message-broadcast-system-node-js/31 which uses this library castv2-client to send messages to my google home. It works. I get the messages no problem, but the code throws

C:\Users\Phil\Documents\google home\node_modules\castv2-client\lib\controllers\receiver.js:72
    callback(null, response.status.volume);
    ^

TypeError: callback is not a function
    at C:\Users\Phil\Documents\google home\node_modules\castv2-client\lib\controllers\receiver.js:72:5
    at fn.onmessage (C:\Users\Phil\Documents\google home\node_modules\castv2-client\lib\controllers\request-response.js:27:7)
    at fn.emit (events.js:203:15)
    at Channel.onmessage (C:\Users\Phil\Documents\google home\node_modules\castv2-client\lib\controllers\controller.js:16:10)
    at Channel.emit (events.js:198:13)
    at Client.onmessage (C:\Users\Phil\Documents\google home\node_modules\castv2\lib\channel.js:23:10)
    at Client.emit (events.js:203:15)
    at PacketStreamWrapper.onpacket (C:\Users\Phil\Documents\google home\node_modules\castv2\lib\client.js:81:10)
    at PacketStreamWrapper.emit (events.js:198:13)
    at TLSSocket.<anonymous> (C:\Users\Phil\Documents\google home\node_modules\castv2\lib\packet-stream-wrapper.js:28:16)

What's wrong with the code that is throwing this AND/OR how can I fix it so it's either more graceful in catching error and doesn't throw since the message still delivers to google home or fix it to not throw this at all? I appreciate any help!

I believe it's here in the castv2-client library that it's referencing, but I haven't been able to make it happy.

ReceiverController.prototype.launch = function(appId, callback) {
  this.request({ type: 'LAUNCH', appId: appId }, function(err, response) {
    if(err) return callback(err);
    if(response.type === 'LAUNCH_ERROR') {
      return callback(new Error('Launch failed. Reason: ' + response.reason));
    }
    callback(null, response.status.applications || []);
  });
};

my code

var Client = require('castv2-client').Client;
var DefaultMediaReceiver = require('castv2-client').DefaultMediaReceiver;
const googleTTS = require('google-tts-api');

var App = {
  playin: false,
  DeviceIp: "",
  Player: null,
  GoogleHome: function (host, url) {
    var client = new Client();
    client.connect(host, function () {
      client.launch(DefaultMediaReceiver, function (err, player) {
        client.setVolume({ level: 1 });
        var media = {
            contentId: url,
            contentType: 'audio/mp3',
            streamType: 'BUFFERED'
        };
        App.Player = player;
        App.Player.load(media, { autoplay: true }, function (err, status) {
            App.Player.on('status', function (status) {
                if (status.playerState === "IDLE" && App.playin === false) {
                    client.close();
                }
            });
        });
      });
    });
    client.on('error', function (err) {
      console.log('Error: %s', err.message);
      client.close();
    });
  },
  run: function (ip, text) {
    App.DeviceIp = ip;
    const url = googleTTS.getAudioUrl(text, {
      lang: 'en-US',
      slow: false,
      host: 'https://translate.google.com',
    });
    App.GoogleHome(App.DeviceIp, url, function (res) {
      console.log(res);
    });
  },
  broadcast: function (text) {
    const ips = '192.168.0.15'.split(","); // From config, 192.168.68.105,192.168.68.107,192.168.68.124
    for (var s of ips) {
      App.run(s, text);
    }
  }
}

App.broadcast("Broadcasted to all of the devices"); //Only works if you did step 4.5
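The trace points at receiver.js:72, which runs when a response arrives for a request whose caller passed no callback; in the code above the likely trigger is client.setVolume({ level: 1 }), which is called without one, so passing even a no-op callback (client.setVolume({ level: 1 }, function () {})) should silence the throw without patching the library. The defensive pattern the library omits, as a standalone sketch (requestVolume is a made-up stand-in, not castv2-client's API):

```javascript
// Guarding an optional callback before invoking it.
function requestVolume(callback) {
  // Simulated response, shaped like the one receiver.js handles:
  const response = { status: { volume: { level: 1, muted: false } } };
  if (typeof callback === 'function') {
    callback(null, response.status.volume); // only call when one was given
  }
  return response.status.volume;
}

const vol = requestVolume(); // no callback: no TypeError
requestVolume(function (err, volume) { // with callback: receives the volume
  if (err) throw err;
});
```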


How to compare two double precision real numbers in Fortran?

I wrote a Fortran subroutine to compute the time of flight (TOF) between two points on an elliptical orbit. Obviously this TOF has to be a positive number. I tested my subroutine with different data, but in some cases I get a negative result despite having coded a possible way to handle this problem.

Here is my subroutine:

!*************************************************************
    subroutine TOF_between_2TA(e,xn,theta1,theta2,delta_t)
!*************************************************************
!  Compute time of flight between two points (point 1 and point 2) on elliptical orbits.
!*************************************************************
!  Input:
!  xn     = mean motion, any units
!  e      = eccentricity
!  theta1 = true anomaly of first point, in interval [0, twopi]
!  theta2 = true anomaly of second point, in interval [0, twopi]
!
!  Output:
!  delta_t = time of flight between two points (same units as given by xn)
!*************************************************************
 
       
        implicit none

        ! Arguments
        double precision, intent(in)  :: e,xn,theta1,theta2
        double precision, intent(out) :: delta_t

        ! Locals
        integer :: i
        double precision               :: xe,sxe,cxe,cth,sth,den,theta
        double precision, dimension(2) :: theta_vec,xm_vec
        
        !! To get positive time intervals, theta_vec must be sorted in ascending order  
        if (theta1 < theta2) then     
           theta_vec = [theta1, theta2]   ! Case theta1 < theta2
        elseif (theta2 < theta1) then 
           theta_vec = [theta2, theta1]  ! Case theta1 > theta2
        endif
        
        do i=1,2          
            theta = theta_vec(i)
            
            cth = cos(theta)
            sth = sin(theta)
            den = 1.0 + e*cth
            cxe = (e + cth)/den
            sxe = sqrt(1.0-e*e)*sth/den 
            xe = atan2(sxe,cxe)
        
            ! atan2 returns angles in interval -pi, +pi, we need angles in interval [0,2pi]
            xe = mod(xe,twopi)
            if (xe .lt. 0.0_pr) xe = xe + twopi 
        
            xm_vec(i) = xe - e*sxe         
            
        enddo
        
        delta_t= (xm_vec(2)-xm_vec(1)) / xn
        
        return
        
    end subroutine TOF_between_2TA

Can you suggest a way to increase the robustness of my subroutine and protect against undesirable results, i.e. negative numbers? I'm fairly sure the problem occurs when I compare the variables theta1 and theta2. What is the smartest way to compare two real numbers?
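For reference, the standard remedy is a tolerance-based comparison rather than exact tests near the boundary. A minimal sketch of the idea in Python (using `math.isclose`; in Fortran the same test would be written with an `epsilon(1.d0)`-scaled tolerance, and the variable values here are made up for illustration):

```python
import math

def nearly_equal(a, b, rel_tol=1e-12, abs_tol=1e-12):
    # Treat two doubles as equal when they agree to within a tolerance,
    # instead of relying on a == b (or a < b) right at the boundary.
    return math.isclose(a, b, rel_tol=rel_tol, abs_tol=abs_tol)

theta1 = 1.0000000000000002   # differs from 1.0 by one representable step
theta2 = 1.0
if nearly_equal(theta1, theta2):
    delta = 0.0               # same point: zero time of flight
elif theta1 < theta2:
    delta = theta2 - theta1
else:
    delta = theta1 - theta2
print(delta)  # prints 0.0
```

The same three-way structure maps directly onto the `if/elseif` block in the subroutine, closing the gap where `theta1` and `theta2` are almost (but not bitwise) equal.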

Edit - After many tests, I report here the working routines:

!*************************************************************
    subroutine TOF_since_pericenter(e,xn,theta,delta_t_peri)
!*************************************************************
!  Time of Flight since pericenter: From true anomaly to time
!*************************************************************
!  Input:  
!  xn    = mean motion, any units
!  e     = eccentricity
!  theta = true anomaly, in interval [0, twopi]
!
!  Output:
!  delta_t_peri = time since pericenter (same units as given by xn)
!*************************************************************
 
        implicit none

        ! Arguments
        double precision, intent(in)  :: e,xn,theta
        double precision, intent(out) :: delta_t_peri

        ! Locals
        double precision :: xe,sxe,cxe,cth,sth,xm,den
        
        cth = cos(theta)
        sth = sin(theta)
        den = 1.d0 + e*cth
        cxe = (e + cth)/den
        sxe = sqrt(1.d0 - e*e)*sth/den 
        xe  = atan2(sxe,cxe)
        ! atan2 returns angles in interval [-pi, +pi], we need angles in interval [0,2pi]
        xe = mod(xe,twopi)
        if (xe .lt. 0.d0) xe = xe + twopi 
        
        xm = xe - e*sxe
        delta_t_peri = xm/xn
        
        return

    end subroutine ToF_since_pericenter
    
    
    
!*************************************************************
    subroutine TOF_between_2TA(e,xn,theta_1,theta_2,delta_t)
!*************************************************************
!  Compute time of flight required by point 1 to reach the point 2 on elliptical orbits.
!  The point 1 is the moving object and it rotates counterclockwise on its orbit 
!*************************************************************
!  Input:
!  xn        = mean motion, any units
!  e         = eccentricity
!  theta_1   = true anomaly of asteroid, in interval [0, twopi]
!  theta_2   = true anomaly of intersection point, in interval [0, twopi]
!
!  Output:
!  delta_t = time of flight between two points (same units as given by xn)
!*************************************************************
 
       
        implicit none

        ! Arguments
        double precision, intent(in)  :: e,xn,theta_1,theta_2
        double precision, intent(out) :: delta_t

        ! Locals
        integer :: i
        double precision               :: T,delta_t1_peri,delta_t2_peri
        double precision, parameter    :: small = 1.d-30
        
        ! Compute orbital period
        T = twopi / xn
        
        call TOF_since_pericenter(e,xn,theta_1,delta_t1_peri)
        call TOF_since_pericenter(e,xn,theta_2,delta_t2_peri)
        
        delta_t = delta_t2_peri - delta_t1_peri
        
        if (delta_t .gt. small) then     
           delta_t = delta_t
        elseif (delta_t .lt. -small) then 
           delta_t = delta_t + T
        endif
           
        return
        
    end subroutine TOF_between_2TA
 
!*************************************************************

Further comments and improvement suggestions are welcome.



2022-05-26

Using selenium to find text of a specific size

I am trying to develop a web scraper in Python to scan messari.io, and I am trying to get each cell under the Type field. Using dev tools and inspect, it seems like the font size is 0.7em. How would I go about using the text size to get the text? I have tried h4, h5, and h6, and none of those return anything. Here is my code:

types = WebDriverWait(driver, 10).until(
    EC.presence_of_all_elements_located((By.TAG_NAME, "h6")))

for type in types:
    if type.text:
        print(type.text)

And here is the site for reference: https://messari.io/governor/daos
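Since Selenium has no font-size locator, one option is to grab a broad set of candidate elements and filter on the computed style. The filtering step might look like the sketch below; a stand-in class mimics the `value_of_css_property` method that real Selenium WebElements provide, so the logic can be shown without a live browser, and the cell texts are made up. Note that 0.7em of a 16px base font computes to 11.2px.

```python
def elements_with_font_size(elements, size):
    # Keep only elements whose computed font-size matches; real Selenium
    # WebElements expose value_of_css_property("font-size") the same way.
    return [el for el in elements
            if el.value_of_css_property("font-size") == size]

class FakeElement:
    # Minimal stand-in for selenium.webdriver.remote.webelement.WebElement
    def __init__(self, text, font_size):
        self.text = text
        self._font_size = font_size
    def value_of_css_property(self, name):
        return self._font_size if name == "font-size" else ""

cells = [FakeElement("Protocol", "11.2px"),
         FakeElement("DAO", "11.2px"),
         FakeElement("Header", "16px")]

matches = elements_with_font_size(cells, "11.2px")
print([el.text for el in matches])  # prints ['Protocol', 'DAO']
```

With a real driver you would pass something like `driver.find_elements(By.CSS_SELECTOR, "td")` as `elements`; the filter itself is unchanged.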



Create % comparison value between two visuals

I am trying to create a new dynamic comparison metric between two table visuals with identical metrics and custom date slicers that create a period A/B view.

Both tables and date slicers reference the same dataset (tableA). I want to create a measure that can calculate the % difference for all metrics between periods A and B, either as a new table or a series of scorecards under the period B table.

For simplicity, I am only using Cost and Date from the table to create these different periods for comparison.

I am not a DAX expert, so I am running into issues with creating my measure since it relies on the same data set. The closest I got was by duplicating my dataset (tableA (1)) so that I could reference the same metric in my calculation, i.e. %_Change_Cost=(SUM(('tableA'[Cost])/('tableA (1)'[Cost]))-1. But when the date filters do not overlap, the calculation breaks.

Period A vs B tables

Thank you!



Merge multiple lists of strings into a unique list in Java 8

Edit 2: I have main data (a list or an array, it doesn't matter) like this:

{1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20}

I want to:

1-replace all values containing 3 with "julie"

2-all values that are val % 15 == 0 should be replaced with "jack".

3-also replace all values that are val % 5 == 0 should be replaced with "john" ,

Note: Without If Else Just With Java 8.

In the end I should have this data :

("1","2","julie","4","john","6","7","8","9","john","11","12","julie","14","jack","16","17","18","19","john")

For this issue I used a stream and replaced the values with the related string, creating 3 related lists, one for each string:

1-Result for need1(replace all values containing 3 with "julie"): ("1","2","julie","4","5","6","7","8","9","10","11","12","julie","14","15","16","17","18","19","20")

2-Result for need2(all values that are val % 15 == 0 should be replaced with "jack"): ("1","2","3","4","5","6","7","8","9","10","11","12","13","14","jack","16","17","18","19","20")

3-Result for need3(replace all values that are val % 5 == 0 should be replaced with "john") :("1","2","3","4","john","6","7","8","9","john","11","12","13","14","john","16","17","18","19","john")

Now I want to have a final result such as below (either by merging these lists or any other method without if/else, just with Java 8):

("1","2","julie","4","john","6","7","8","9","john","11","12","julie","14","jack","16","17","18","19","john")

Thanks!
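The precedence logic above ("contains 3" beats % 15, which beats % 5) can be expressed as an ordered list of predicate/label pairs with a single first-match lookup per value, with no if/else chain. A Python sketch of the idea (the same shape maps onto a Java 8 stream over a `List` of predicates):

```python
# Ordered rules: the first matching rule wins, so "contains a 3"
# takes precedence over % 15, which takes precedence over % 5.
rules = [
    (lambda n: "3" in str(n), "julie"),
    (lambda n: n % 15 == 0,   "jack"),
    (lambda n: n % 5 == 0,    "john"),
]

def label(n):
    # next() returns the label of the first rule that fires,
    # falling back to the number itself when none do.
    return next((name for pred, name in rules if pred(n)), str(n))

result = [label(n) for n in range(1, 21)]
print(result)
```

Running this over 1..20 yields exactly the final list quoted above, with 3 and 13 mapped to "julie", 15 to "jack", and 5, 10, 20 to "john".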



FCM Custom Notification Icon in flutter

Good morning all,

I'm using Firebase Cloud Messaging to send notifications, but I'm trying to change the notification icon and can't do it.

I've tried a lot of solutions but none of them work for me, such as:

1-

<meta-data
  android:name="com.google.firebase.messaging.default_notification_icon"
  android:resource="@mipmap/ic_notification" />

2- changing the launcher icon

3- modifying launch_background.xml and making it transparent

and a lot of other solutions. Could anyone help me, please?



Find mount points from cat /etc/fstab with Ansible

I would like to create a playbook that checks mount points with fstype ext against the whitelist var, iterating through the list to check whether each mount point exists. If it exists, the output should be similar to the one below; otherwise the item is ignored.

/ /boot /home /opt /var /var/opt /var/tmp /var/log /var/log/audit

Here is my playbook, which uses 'xfs' since I don't have ext on my machine. Could you advise a more efficient way to achieve the desired result?

  - hosts: all
    vars:
      whitelist:
        - '/'
        - '/boot'
        - '/home'
        - '/opt'
        - '/var'
        - '/bin'
        - '/usr'  

    tasks:
      - set_fact:
          mount_point: "{{ ansible_mounts | selectattr('fstype', 'equalto', 'xfs') | map(attribute='mount') | list }}"
      - debug:
          var: mount_point
        loop: "{{ whitelist }}"
        when: item in mount_point

TASK [set_fact] **************************************************************************************************************
ok: [ansible2]
ok: [ansible3]

TASK [debug] *****************************************************************************************************************
ok: [ansible2] => (item=/) => {
    "msg": [
        "/",
        "/boot"
    ]
}
ok: [ansible2] => (item=/boot) => {
    "msg": [
        "/",
        "/boot"
    ]
}
skipping: [ansible2] => (item=/home)
skipping: [ansible2] => (item=/opt)
skipping: [ansible2] => (item=/var)
skipping: [ansible2] => (item=/bin)
skipping: [ansible2] => (item=/usr)
ok: [ansible3] => (item=/) => {
    "msg": [
        "/boot",
        "/"
    ]
}
ok: [ansible3] => (item=/boot) => {
    "msg": [
        "/boot",
        "/"
    ]
}
skipping: [ansible3] => (item=/home)
skipping: [ansible3] => (item=/opt)
skipping: [ansible3] => (item=/var)
skipping: [ansible3] => (item=/bin)
skipping: [ansible3] => (item=/usr)

PLAY RECAP *******************************************************************************************************************
ansible2                   : ok=3    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0
ansible3                   : ok=3    changed=0    unreachable=0    failed=0    skipped=0    rescued=0    ignored=0


How to use own out variable instead of gl_FragColor? [solved]

I want to write a simple shader (I am using Three.js, with GLSL as the shader language) that colors a cube.

here is an image of this cube

It's working as long as I use gl_FragColor in my FragmentShader, but apparently gl_FragColor should not be used anymore as it is deprecated, so I created my own out variable:

in vec3 pos;
out vec4 outColor;

void main() {
    float r = pos.x;
    float g = pos.y;
    float b = pos.z;
    outColor = vec4(r,g,b,1.0);
}

However, this results in the following error message:

ERROR: 0:44: 'outColor' : must explicitly specify all locations when using multiple fragment outputs

I looked for possible answers and don't really understand this approach:

layout(location = 0) out vec4 outColor;

This gives me the error message

ERROR: 0:44: 'outColor' : conflicting output locations with previously defined output 'pc_fragColor'

but I never declared pc_fragColor. When I use other numbers than 0 (e.g. layout(location = 1)) then the cube is white.

What am I doing wrong?

Solution: I found a solution to my problem. Specifying the GLSL version when declaring the material in my three.js script helped:

new THREE.ShaderMaterial({ 
     uniforms: uniforms, 
     vertexShader: vShader, 
     fragmentShader: fShader, 
     vertexColors: true, 
     glslVersion: THREE.GLSL3,
});


2022-05-25

vue-advanced-cropper image croped sized is bigger than the original

I'm implementing a system where the user chooses an image and must crop it before saving; I'm using the vue-advanced-cropper plugin. Everything is set up, but the resulting image is bigger than the original.

For example: I insert a 307 KB image and get 448 KB back; I insert a 40 KB image and get 206 KB back.

Is there an option I missed to make the result smaller than the original, or is there nothing that can be done?



How to get the return value of a task coming from an event loop?

The purpose of this implementation is to be able to call async functions without the "await" keyword.
I have code that mixes some sync and async functions; I am calling an async function (B) from a sync function (A) inside an event loop, and I am unable to get the return value of the async function. An example follows:

import asyncio
import time


async def B(x):
    print(f'before_delay {x}')
    await asyncio.sleep(1.0)
    print(f'after_delay {x}')
    return x*x

def A(x):
    task = asyncio.create_task(B(x))
    print(task)
    asyncio.get_event_loop().run_until_complete(task)  # did not work
    return 0


async def loop_func():
    res = A(9)
    print(f'after calling function {res}')

async def main():
    while True:
        await loop_func()
        await asyncio.sleep(3.0)

asyncio.run(main())

The error I am getting is quite understandable:

RuntimeError: Cannot run the event loop while another loop is running

The problem is that my program already has a few event loops running in the background, so I can't use asyncio.run() or run_until_complete (or any other low-level functions for that matter). asyncio.wait() does not work either, and task.done() is never True.
The expected output is:

>>> before_delay 9
>>> after_delay 9 
>>> 81
>>> after calling function 0
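One workaround that fits the "loop already running" constraint is to let A schedule the coroutine and harvest its result through a done-callback instead of blocking. A minimal sketch (the delays are shortened, and the result is stashed in a dict rather than returned, both my own simplifications):

```python
import asyncio

results = {}  # where the sync caller can later find B's return value

async def B(x):
    await asyncio.sleep(0.01)  # stand-in for the real async work
    return x * x

def A(x):
    # We are inside a running loop, so we cannot block on the task;
    # schedule it and record its result when it completes.
    task = asyncio.get_running_loop().create_task(B(x))
    task.add_done_callback(lambda t: results.update({x: t.result()}))
    return 0

async def main():
    res = A(9)
    await asyncio.sleep(0.05)  # give the scheduled task time to finish
    print(results[9], res)     # prints: 81 0

asyncio.run(main())
```

The callback fires on the same loop, so no second `run_until_complete` is needed; the trade-off is that the sync function returns before the result exists.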


How to permanently fix chmod changes to an app deployed on Heroku? [duplicate]

I just made a post yesterday - How to solve "500 internal server error" on a deployed python app on Heroku?

And today I actually discovered the problem. The program isn't missing; instead, the program's access on Heroku has been denied. So I just SSH'd into Heroku and chmod'ed the exiftool, and now it works fine.

But then comes another problem: this chmod change is only temporary. After a few minutes the error comes back, and I have to do the chmod thing again to make it work.

Is there any way to permanently fix this?



What is predict value of GBM model in R? and why NaN residual?

Well, I have a GBM model for nematode density with some predictor variables (SI = Spectral Index).

However, my model shows NaN residuals with the Poisson distribution, and both predict(gbm.fit) and gbm.fit$fit return continuous values, while my observed counts are discrete.

Should I use predict(gbm.fit) or gbm.fit$fit? What does gbm.fit$fit actually give me?

Can anyone help me with this problem?

This is the gbm algorithm used:

gbm.fit <- gbm(
  formula = juv ~ NDRE + WI + GRAY + RSVI + VDVI,
  distribution = "poisson",
  data = data_base,
  n.trees = 5000,
  interaction.depth = 15,
  bag.fraction = 3,
  shrinkage = 0.01,
  cv.folds = 5,
  n.cores = NULL, # will use all cores by default
  verbose = FALSE
)

Then I do:

sqrt(min(gbm.fit$fit))

Which produces this error:

Warning in sqrt(min(gbm.fit$cv.error)) : NaNs produced
[1] NaN


KStream disable local state store

I am using Kafka Stream with Spring cloud Stream. Our application is stateful as it does some aggregation. When I run the app, I see the below ERROR message on the console. I am running this app in a Remote Desktop Windows machine.

Failed to change permissions for the directory C:\Users\andy\project\tmp
Failed to change permissions for the directory C:\Users\andy\project\tmp\my-local-local

But when the same code is deployed on a Linux box, I don't see the error, so I assume it's an access issue.

As per our company policy, we do not have access to change a folder's permissions, and hence chmod 777 did not work either.

My question is: is there a way to disable creating the state store locally and instead use the Kafka changelog topic to maintain the state? I understand this is not ideal, but it's only for my local development. TIA.
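One knob worth noting: Kafka Streams places its RocksDB files under the `state.dir` config, which defaults to a temp directory. A sketch of redirecting it to a writable location via Spring Cloud Stream (the binder property path is assumed from the Spring Cloud Stream Kafka Streams binder documentation, where arbitrary Kafka Streams properties go under `binder.configuration`; adjust to your binder version):

```properties
# Point the Kafka Streams state directory at a path the app may write to,
# instead of the default location the OS user cannot modify.
spring.cloud.stream.kafka.streams.binder.configuration.state.dir=C:/Users/andy/kstreams-state
```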



2022-05-24

Togglz with Kotlin: Problem to inject some dependency in my custom ActivationStrategy

I have defined the file that references my custom ActivationStrategy in META-INF/services/, as explained in the Togglz documentation for custom strategies (https://www.togglz.org/documentation/activation-strategies.html), since I need to decide whether the feature is active using logic that lives in another service. Now, how do I inject the service that I need to consume? This is what I tried:

@Component
class ActivationStrategyByProfile(private val profileService : ProfileService) : ActivationStrategy {

    override fun getId(): String {
        return ID
    }

    override fun getName(): String {
        return NAME
    }

    override fun isActive(
        featureState: FeatureState,
        user: FeatureUser?
    ): Boolean {
        
        val profileId = user?.name
        return profileService.validateProfile(profileId)
    }
    ...

My file in /META-INF/services/org.togglz.core.spi.ActivationStrategy contain:

com.saraza.application.config.ActivationStrategyByProfile

It returns the following error, which as I understand it means that the class referenced in the file has no parameterless constructor:

...ActivationStrategyByProfile Unable to get public no-arg constructor

How can I inject my service as a dependency? Can it be done through the file used by the Java ServiceLoader mechanism, for example by specifying the service as a parameter?



Can {0} initialize a structure (local) variable several times correctly?

I've noticed that inside an STM32 library there is a piece of code which initializes a structure variable with {0}. Below is a simplified example:

typedef struct
{
    uint16_t val_a;
    uint16_t val_b;
    uint16_t val_c;
} dataset_t;

dataset_t Dataset = {0};

The goal of this code is to initialize all the elements of the variable Dataset to 0. Is this a correct way to initialize the variable? Is it possible that this method initializes only the first element (val_a) to 0, but not all the elements, if we perform this initialization many times?



Machine Learning Question on missing values in training and test data

I'm training a text classifier for binary classification. In my training data, there are null values in the .csv file in the text portion, and there are also null values in my test file. I have converted both files to a dataframe (Pandas). This is a small percentage of the overall data (less than 0.01).

Knowing this - is it better to replace the null text fields with an empty string or leave them as null? And if the answer is to replace them with an empty string, is it "acceptable" to do the same for the test CSV file before running it against the model?
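If the choice is to replace, the usual practice is to apply the identical preprocessing to both files. A small pandas sketch (the column names and values here are made up for illustration):

```python
import pandas as pd

train = pd.DataFrame({"text": ["good product", None, "bad product"],
                      "label": [1, 0, 0]})
test = pd.DataFrame({"text": [None, "great value"],
                     "label": [1, 1]})

# Replace missing text with an empty string in BOTH frames, so the
# vectorizer sees the same representation at fit time and predict time.
train["text"] = train["text"].fillna("")
test["text"] = test["text"].fillna("")
```

An empty string simply contributes no tokens to most text vectorizers, whereas a null usually raises an error, which is why filling is the common default for such a small fraction of rows.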



Why doesn't my sslstream get the certificate from a mail server?

From my code below, I should be getting the certificate of the mail server "mailgw.th-nuernberg.de".

That didn't work, and I get the error "the handshake failed due to an unexpected packet format" when calling "AuthenticateAsClient".

I tried the same code with the mail server "smtp.gmail.com" on port 993; that works, and I get the full certificate. The mail server "mailgw.th-nuernberg.de" exists, but I don't know why Google's mail server works while this one doesn't.

Here is my Code:

X509Certificate2 cert = null;
var client = new TcpClient("mailgw.th-nuernberg.de", 25);
var certValidation = new RemoteCertificateValidationCallback(delegate (object snd, X509Certificate certificate,
            X509Chain chainLocal, SslPolicyErrors sslPolicyErrors)
{
    return true; //Accept every certificate, even if it's invalid
});

// Create an SSL stream and takeover client's stream
using (var sslStream = new SslStream(client.GetStream(), true, certValidation))
{
    sslStream.AuthenticateAsClient("mailgw.th-nuernberg.de", null, System.Security.Authentication.SslProtocols.Tls13 | System.Security.Authentication.SslProtocols.Tls12 | System.Security.Authentication.SslProtocols.Tls11, true);
    var serverCertificate = sslStream.RemoteCertificate;
    cert = new X509Certificate2(serverCertificate);
    //System.Diagnostics.Debug.WriteLine("Heruntergeladenes Zertifikat: " + cert);
}
}
catch (Exception e)
{
    System.Diagnostics.Debug.WriteLine(e.Message);
    //throw some fancy exception ;-)
}

Does anyone know what the problem is? What's the difference using the Google mail server instead of using the mail server from my University?



How can I restrict a time input value?

I want to display a time input restricted to the hours from 08:00 to 20:00. I tried this:

<input type="time" id="timeAppointment" name = "timeAppointment" min="08:00" max="20:00" placeholder="hour"  required/>

But when I display it I can still select any time; it does not restrict me as I indicated. What is the problem? If some code is necessary, I'm working with JavaScript.



2022-05-23

Why does this code execute more slowly after strength-reducing multiplications to loop-carried additions?

I am reading Agner Fog's optimization manuals, and I came across this example:

double data[LEN];

void compute()
{
    const double A = 1.1, B = 2.2, C = 3.3;

    int i;
    for(i=0; i<LEN; i++) {
        data[i] = A*i*i + B*i + C;
    }
}

Agner indicates that there's a way to optimize this code - by realizing that the loop can avoid using costly multiplications, and instead use the "deltas" that are applied per iteration.

I use a piece of paper to confirm the theory, first...

Theory

...and of course, he is right - in each loop iteration we can compute the new result based on the old one, by adding a "delta". This delta starts at value "A+B", and is then incremented by "2*A" on each step.

So we update the code to look like this:

void compute()
{
    const double A = 1.1, B = 2.2, C = 3.3;
    const double A2 = A+A;
    double Z = A+B;
    double Y = C;

    int i;
    for(i=0; i<LEN; i++) {
        data[i] = Y;
        Y += Z;
        Z += A2;
    }
}
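Before benchmarking, the recurrence can be sanity-checked in a few lines (a quick Python replay of the algebra, independent of the C timing code):

```python
# Replay the loop-carried recurrence and check that it matches the
# polynomial A*i*i + B*i + C at every step.
A, B, C = 1.1, 2.2, 3.3
Z, Y = A + B, C            # delta starts at A+B, value starts at C
for i in range(100):
    assert abs(Y - (A*i*i + B*i + C)) < 1e-6
    Y += Z                 # advance the value by the current delta
    Z += 2 * A             # advance the delta itself by 2*A
```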

In terms of operational complexity, the difference between these two versions of the function is indeed striking. Multiplications have a reputation for being significantly slower in our CPUs, compared to additions. And we have replaced 3 multiplications and 2 additions... with just 2 additions!

So I go ahead and add a loop to execute compute a lot of times - and then keep the minimum time it took to execute:

unsigned long long ts2ns(const struct timespec *ts)
{
    return ts->tv_sec * 1e9 + ts->tv_nsec;
}

int main(int argc, char *argv[])
{
    unsigned long long mini = 1e9;
    for (int i=0; i<1000; i++) {
        struct timespec t1, t2;
        clock_gettime(CLOCK_MONOTONIC_RAW, &t1);
        compute();
        clock_gettime(CLOCK_MONOTONIC_RAW, &t2);
        unsigned long long diff = ts2ns(&t2) - ts2ns(&t1);
        if (mini > diff) mini = diff;
    }
    printf("[-] Took: %lld ns.\n", mini);
}

I compile the two versions, run them... and see this:

# gcc -O3 -o 1 ./code1.c

# gcc -O3 -o 2 ./code2.c

# ./1
[-] Took: 405858 ns.

# ./2
[-] Took: 791652 ns.

Well, that's unexpected. Since we report the minimum time of execution, we are throwing away the "noise" caused by various parts of the OS. We also took care to run on a machine that is doing absolutely nothing else. And the results are more or less repeatable; re-running the two binaries shows this is a consistent result:

# for i in {1..10} ; do ./1 ; done
[-] Took: 406886 ns.
[-] Took: 413798 ns.
[-] Took: 405856 ns.
[-] Took: 405848 ns.
[-] Took: 406839 ns.
[-] Took: 405841 ns.
[-] Took: 405853 ns.
[-] Took: 405844 ns.
[-] Took: 405837 ns.
[-] Took: 406854 ns.

# for i in {1..10} ; do ./2 ; done
[-] Took: 791797 ns.
[-] Took: 791643 ns.
[-] Took: 791640 ns.
[-] Took: 791636 ns.
[-] Took: 791631 ns.
[-] Took: 791642 ns.
[-] Took: 791642 ns.
[-] Took: 791640 ns.
[-] Took: 791647 ns.
[-] Took: 791639 ns.

The only thing to do next, is to see what kind of code the compiler created for each one of the two versions.

objdump -d -S shows that the first version of compute - the "dumb", yet somehow fast code - has a loop that looks like this:

objdump naive

What about the second, optimized version - that does just two additions?

objdump optimized

Now I don't know about you, but speaking for myself, I am... puzzled. The second version has approximately 4 times fewer instructions, with the two major ones being just SSE-based additions (addsd). The first version, not only has 4 times more instructions... it's also full (as expected) of multiplications (mulpd).

I confess I did not expect that result. Not because I am a fan of Agner (I am, but that's irrelevant).

Any idea what I am missing? Did I make any mistake here, that can explain the difference in speed? Note that I have done the test on a Xeon W5580 and a Xeon E5 1620 - in both, the first (dumb) version is much faster than the second one.

EDIT: For easy reproduction of the results, I added two gists with the two versions of the code: Dumb yet somehow faster and optimized, yet somehow slower.

P.S. Please don't comment on floating point accuracy issues; that's not the point of this question.



Python world analog of Rails encrypted credentials feature (to store secrets securely)

Are there Python analogs of the Rails encrypted credentials feature?

Quote from Rails Guides on subject:

Rails stores secrets in config/credentials.yml.enc, which is encrypted and hence cannot be edited directly. Rails uses config/master.key or alternatively looks for the environment variable ENV["RAILS_MASTER_KEY"] to encrypt the credentials file. Because the credentials file is encrypted, it can be stored in version control, as long as the master key is kept safe.

To edit the credentials file, run bin/rails credentials:edit. This command will create the credentials file if it does not exist. Additionally, this command will create config/master.key if no master key is defined.

Secrets kept in the credentials file are accessible via Rails.application.credentials.

My idea is:

  • to have all the secrets encrypted in the repository;
  • to have locally only master.key (or only one env variable);
  • to pass master.key to the production server manually, once;
  • then to pass the other secrets via git through the automated deployment process.
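A rough Python sketch of the same idea, using the third-party `cryptography` package's Fernet (key handling, file paths, and the env-variable fallback are left out; this is an illustration of the mechanism rather than a drop-in Rails equivalent):

```python
import json
from cryptography.fernet import Fernet  # third-party 'cryptography' package

# Generate the master key once and keep it OUT of version control
# (the analog of Rails' config/master.key / RAILS_MASTER_KEY).
master_key = Fernet.generate_key()

secrets = {"db_password": "hunter2", "api_key": "abc123"}

# Encrypt the credentials blob; the resulting token is safe to commit
# (the analog of config/credentials.yml.enc).
token = Fernet(master_key).encrypt(json.dumps(secrets).encode())

# At runtime, decrypt with the master key to access the secrets.
loaded = json.loads(Fernet(master_key).decrypt(token))
```

In practice the token would live in the repository and the key in an environment variable on the server, mirroring the Rails workflow described above.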


Get value from joined tables, Codeigniter

I'm trying to get and display the data from joined tables but I can't get anything in return. I referenced this and it worked in my other function, but when I tried it again in a different function, I can't get any results.

Here's the Model:

public function viewReview($id)
{
    $this->db->select('clients.firstname, clients.lastname, packages.title, rate_review.review, rate_review.date_created');
    $this->db->where('rate_review.id', $id);
    $this->db->join('clients', 'clients.id = rate_review.user_id');
    $this->db->join('packages', 'rate_review.package_id = packages.id');
    $this->db->from('rate_review');

    $query = $this->db->get();
    return $query->row_array();
}

Controller:

public function view_review($id)
{
    $data['title'] = 'Rate & Reviews';
    $data['review'] = $this->Admin_model->viewReview($id);

    $this->load->view('../admin/template/admin_header');
    $this->load->view('../admin/template/admin_topnav');
    $this->load->view('../admin/template/admin_sidebar');
    $this->load->view('../admin/rate_review/view', $data);
    $this->load->view('../admin/template/admin_footer');
}

View:

<div class="card-body">
  <p>User:
    <?php echo $review['firstname'].' '. $review['lastname']; ?>
  </p>
  <p>Package:
    <?php echo $review['title']; ?>
  </p>
  <div class="m-2">
    <pre class="border-2"><?php echo $review['review']; ?></pre>
    <br>
    <span class="mt-2"><?php echo $review['date_created']; ?></span>
  </div>
</div>