I'm coming from ASP.NET Core, where I use its built-in dependency injection to implement inversion of control in my applications. I would like to know whether there is any way to provide FastAPI with a list of abstract-class-to-implementation mappings, so that when some part of my code requires a dependency, it is injected through the constructor via the abstract class (similar to how it is done in .NET Core).
For more clarity, I attach a code snippet (not functional, for explanation purposes only) of what I want to know whether it is possible to do in FastAPI:
# abstract class of repository for product
class ABCProductRepository(ABC):
    @abstractmethod
    def createProduct(self, productData: str):
        ...

# class that "implements" ABCProductRepository
class ProductRepository(ABCProductRepository):
    def createProduct(self, productData: str):
        print(f"I'm creating a new product: {productData}")

# Class provided with a ProductRepository instance via dependency injection.
class ProductService():
    def __init__(self, productRepo: ABCProductRepository):
        self.pRepo = productRepo
        self.pRepo.createProduct("usb cable")
# FastAPI section
app = FastAPI()

# configuration of dependencies
def config_dependencies(app):
    app.imaginary_method_to_add_dependencies(
        dependency = ABCProductRepository,
        provider = ProductRepository,
        other_imaginary_configs = ...
    )

config_dependencies(app)

# routes section
@app.post("/products")
def save_product(product: str):
    ProductService()
I don't know whether this can be done with FastAPI the way it is done in ASP.NET, or whether it requires a dependency injection library other than the one FastAPI provides.
Dependency injection can be accomplished in FastAPI using Depends.
Here's an example of what FastAPI can offer.
Repository:
Start by building the repository, deriving the product repo from Python's ABC class. Assuming the scope is to CRUD a product, Pydantic's BaseModel can represent the model:
from abc import ABC, abstractmethod
from fastapi import FastAPI, Depends
from pydantic import BaseModel


class ProductModel(BaseModel):
    """
    Pydantic model for request/response
    """
    title: str


class ProductRepositoryABC(ABC):
    """
    Abstract base product repository
    """
    @abstractmethod
    def create_product(self, product: ProductModel) -> ProductModel:
        raise NotImplementedError


class ProductRepository(ProductRepositoryABC):
    """
    Product repository
    """
    def create_product(self, product: ProductModel) -> ProductModel:
        print(f"I'm creating a new product: {product}")
        return product
Service:
After creating the model and the repository, we can start creating a service by injecting the repo as a dependency.
class ProductService(object):
    """
    Product service
    """
    def __init__(self, product_repo: ProductRepositoryABC = Depends(ProductRepository)):
        self.product_repo = product_repo

    def create_product(self, product: ProductModel) -> ProductModel:
        return self.product_repo.create_product(product=product)
Views/Routes:
Injecting the service into the route function is one step further: pass the service to the function as a parameter and use Depends to inject it; the service is then accessible in the function scope.
app = FastAPI()


@app.post(
    "/products",
    name="products:create",
    response_model=ProductModel
)
def create_product(
    product: ProductModel,
    product_srv: ProductService = Depends(ProductService)
) -> ProductModel:
    return product_srv.create_product(product=product)
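To sanity-check the wiring end to end, here is a minimal sketch using FastAPI's TestClient (it assumes the code above lives in main.py; that file name is an assumption):

# Minimal smoke test for the route above; assumes the app is defined in main.py.
from fastapi.testclient import TestClient

from main import app

client = TestClient(app)

response = client.post("/products", json={"title": "usb cable"})
assert response.status_code == 200
assert response.json() == {"title": "usb cable"}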
Complete code:
main.py
from abc import ABC, abstractmethod
from fastapi import FastAPI, Depends
from pydantic import BaseModel


class ProductModel(BaseModel):
    """
    Pydantic model for request/response
    """
    title: str


class ProductRepositoryABC(ABC):
    """
    Abstract base product repository
    """
    @abstractmethod
    def create_product(self, product: ProductModel) -> ProductModel:
        raise NotImplementedError


class ProductRepository(ProductRepositoryABC):
    """
    Product repository
    """
    def create_product(self, product: ProductModel) -> ProductModel:
        print(f"I'm creating a new product: {product}")
        return product


class ProductService(object):
    """
    Product service
    """
    def __init__(self, product_repo: ProductRepositoryABC = Depends(ProductRepository)):
        self.product_repo = product_repo

    def create_product(self, product: ProductModel) -> ProductModel:
        return self.product_repo.create_product(product=product)


app = FastAPI()


@app.post(
    "/products",
    name="products:create",
    response_model=ProductModel
)
def create_product(
    product: ProductModel,
    product_srv: ProductService = Depends(ProductService)
) -> ProductModel:
    return product_srv.create_product(product=product)
Global settings:
You can use Pydantic to build config-based settings with a class mapper for global settings. You will also need a utility that imports classes from dotted-path strings; the Django community has well-tested functions for this, or you can import dynamically with importlib's import_module directly.
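As an illustration, a minimal import_string sketch (an assumption modeled on Django's utility, not code from the original answer) could look like this:

from importlib import import_module


def import_string(dotted_path: str):
    """Import a class or attribute from a dotted path like 'pkg.module.Name'."""
    module_path, _, attr_name = dotted_path.rpartition(".")
    if not module_path:
        raise ImportError(f"{dotted_path!r} is not a valid dotted path")
    module = import_module(module_path)
    try:
        return getattr(module, attr_name)
    except AttributeError as exc:
        raise ImportError(f"{module_path!r} has no attribute {attr_name!r}") from exc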
config.py
from pydantic import BaseSettings

from .utils import import_string  # YOUR IMPORT STRING FUNCTION


class Settings(BaseSettings):
    app_name: str = "Awesome API"
    services = {
        "product_repo": {"class": import_string("repositories.ProductRepository")}
    }
settings.py
from functools import lru_cache

from fastapi import FastAPI

from . import config

app = FastAPI()


@lru_cache()
def get_settings():
    return config.Settings()
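Such cached settings are typically consumed via Depends, following the standard pattern from the FastAPI settings documentation (a minimal sketch; the /info route is a hypothetical example):

from fastapi import Depends

from .config import Settings
from .settings import app, get_settings


@app.get("/info")
def info(settings: Settings = Depends(get_settings)):
    return {"app_name": settings.app_name}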
Then start packaging your module with a more robust structure, like the following:
services/products.py
from fastapi import Depends

from settings import get_settings

settings = get_settings()


class ProductService(object):
    """
    Product service
    """
    def __init__(self, product_repo: ProductRepositoryABC = Depends(settings.services["product_repo"]["class"])):
        self.product_repo = product_repo

    def create_product(self, product: ProductModel) -> ProductModel:
        return self.product_repo.create_product(product=product)
Note: one limitation of Depends is that there is currently no straightforward way to use it outside of the FastAPI context. Luckily, you can still combine FastAPI with a powerful tool like https://github.com/ets-labs/python-dependency-injector to build robust, decoupled modules; that project ships a FastAPI example.
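As a rough illustration of that combination, here is a minimal sketch based on dependency-injector's declarative containers and wiring (the repository/service classes are simplified stand-ins for the ones above, declared without the Depends default):

import sys

from dependency_injector import containers, providers
from dependency_injector.wiring import Provide, inject
from fastapi import Depends, FastAPI


class ProductRepository:
    def create_product(self, title: str) -> dict:
        return {"title": title}


class ProductService:
    def __init__(self, product_repo: ProductRepository):
        self.product_repo = product_repo

    def create_product(self, title: str) -> dict:
        return self.product_repo.create_product(title)


class Container(containers.DeclarativeContainer):
    # The container, not FastAPI, decides which concrete class backs the service.
    product_repo = providers.Factory(ProductRepository)
    product_service = providers.Factory(ProductService, product_repo=product_repo)


app = FastAPI()
container = Container()
container.wire(modules=[sys.modules[__name__]])


@app.post("/products")
@inject
def create_product(
    title: str,
    service: ProductService = Depends(Provide[Container.product_service]),
):
    return service.create_product(title)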
@Josu16 I also had the same problem in FastAPI with a class as a dependency. What I did, and it may sound funny, is create an instance of ProductService and pass the ProductRepository class itself as its argument, outside of the @router:
pro_service = ProductService(ProductRepository)
It may not be best practice, but it just works.
The product service file looks like this:
def __init__(self, repo: ProductRepositoryABC = Depends()) -> None:
I'm still trying to figure out a proper way based on Depends().
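For reference, a minimal sketch of one way to keep the abstract annotation while letting FastAPI resolve a concrete class is to bind the default to a provider callable (get_product_repo is an illustrative name, not from the original answers; ProductRepositoryABC and ProductRepository are the classes defined above):

from fastapi import Depends


def get_product_repo() -> ProductRepositoryABC:
    # Swap in a different implementation here without touching the service.
    return ProductRepository()


class ProductService:
    def __init__(self, repo: ProductRepositoryABC = Depends(get_product_repo)) -> None:
        self.repo = repo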
The situation is simple - in a global library (outside the sandbox):
in src - a.b.c.Utils.groovy
in vars - Defaults.groovy
How do I call Defaults.groovy from within Utils.groovy?
In src:
#!groovy
package a.b.c

public class Utils implements Serializable {
    def script

    public def run() {
        println(Defaults.text)
        //groovy.lang.MissingPropertyException: No such property: Defaults for class: a.b.c.Utils
    }
}
in vars:
#!groovy
public class Defaults {
    public static def text = "hello world"
}
in Jenkinsfile:
#Library("ItLoads")
utils = new a.b.c.Utils(script:this)
...
utils.run()
So I tried to load the library explicitly:
#!groovy
package a.b.c

public class Utils implements Serializable {
    def script

    public def run() {
        println(script.library("ItLoads").Defaults.text)
        //Only using first definition of library ItLoads
        //java.lang.IllegalAccessException: Defaults was defined in file:///apps/opt/.../vars/Ansible.groovy which was not inside file:///apps/opt/.../src/
    }
}
So, Defaults is defined somewhere, but I have no idea how to get to it...
If I try to use Defaults in the Jenkinsfile, it works.
HELP
Looking at your code I can see that you're not using the script object to access Defaults:
Instead of Defaults.text you should use something like script.Defaults.text.
Adding some links as a reference for future readers:
https://jenkins.io/doc/book/pipeline/shared-libraries/
How to access pipeline DSL in groovy classes(and not in Jenkinsfile)?
Using the configuration below, I am able to connect Samza to the Kafka broker:
systems.kafka.samza.factory=org.apache.samza.system.kafka.KafkaSystemFactory
systems.kafka.samza.msg.serde=json
systems.kafka.consumer.zookeeper.connect=localhost:2181/
systems.kafka.producer.bootstrap.servers=localhost:9092
But I have some doubts regarding the SystemFactory class. How do I write my own SystemFactory class, and what is its purpose? Please give me some idea.
You can write your own system factory class by extending the SystemFactory interface and implementing its three abstract functions: getConsumer, getProducer, and getAdmin. In each of these functions, taking getConsumer as an example, you create a system consumer, an instance of another customized class extending SystemConsumer that defines how the system should consume. By doing so, your Samza job knows how to get the admin/consumer/producer of the system when needed.
Example (in Scala):
class YourSystemFactory extends SystemFactory {
  override def getConsumer(systemName: String, config: Config, registry: MetricsRegistry): SystemConsumer = {
    new YourSystemConsumer(
      getAdmin(systemName, config).asInstanceOf[YourSystemAdmin],
      config.get("someParam"))
  }

  override def getAdmin(systemName: String, config: Config): SystemAdmin = {
    new YourSystemAdmin(
      config.get("someParam"),
      config.get("someOtherParam"))
  }

  override def getProducer(systemName: String, config: Config, registry: MetricsRegistry): SystemProducer = {
    new YourSystemProducer(
      getAdmin(systemName, config).asInstanceOf[YourSystemAdmin],
      config.get("someParam"))
  }
}
In your config:
# Your system params
systems.your.samza.factory=your.package.YourSystemFactory
systems.your.consumer.param=value
systems.your.producer.param=value
You don't need to implement your own KafkaSystemFactory. You just have to implement a StreamTask.
Example :
public class MyTaskClass implements StreamTask {
    public void process(IncomingMessageEnvelope envelope, MessageCollector collector, TaskCoordinator coordinator) {
        // process message
    }
}
Config :
# This is the class above, which Samza will instantiate when the job is run
task.class=com.example.samza.MyTaskClass
# Define a system called "kafka" (you can give it any name, and you can define
# multiple systems if you want to process messages from different sources)
systems.kafka.samza.factory=org.apache.samza.system.kafka.KafkaSystemFactory
# The job consumes a topic called "PageViewEvent" from the "kafka" system
task.inputs=kafka.PageViewEvent
# Define a serializer/deserializer called "json" which parses JSON messages
serializers.registry.json.class=org.apache.samza.serializers.JsonSerdeFactory
# Use the "json" serializer for messages in the "PageViewEvent" topic
systems.kafka.streams.PageViewEvent.samza.msg.serde=json
For more info, see the Samza documentation.
I thought DI was implemented to allow using the same services across the application and swapping them as needed. However, this snippet (Angular 2.0.0-beta.0) refuses to work:
# boot.ts
import {ProjectService} from './project.service'
bootstrap(AppComponent, [ProjectService]);
# my.component.ts
export class MyComponent {
constructor(project: ProjectService) {
}
}
and with an explicit service import it works:
# my.component.ts
import {ProjectService} from './project.service';
export class MyComponent {
constructor(project: ProjectService) {
}
}
The official docs are somewhat inconsistent, but the plunkr example does the same:
# boot.ts
import {HeroesListComponent} from './heroes-list.component';
import {HeroesService} from './heroes.service';
bootstrap(HeroesListComponent, [HeroesService])
# heroes-list.component.ts
import {HeroesService} from './heroes.service';
Is this the intended way to use DI? Why do we have to import the service in every class requiring it, and where is the benefit if we can't just describe the service once at boot?
This isn't really related to dependency injection. You can't use a class in TS that is not imported.
This line references a class, and DI derives from the type which instance to inject:
constructor(project: ProjectService) {
If the type isn't specified by a concrete import, DI can't know which of all possible ProjectService classes should be used.
What you can do, for example, is request one type (ProjectService) and get a different implementation (a subclass like MockProjectService or EnhancedProjectService, ...):
bootstrap(HeroesListComponent, [provide(ProjectService, {useClass: MockProjectService})]);
This way DI would inject a MockProjectService for the following constructor:
constructor(project: ProjectService) {
I have tried, under Plone 4.3.3, to customize a class method of an archetype content type in one of my products.
I have a product bsw.produit_1 with a content type MyContent defined as follows:
class MyContent(base.ATCTContent):
    implements(IMyContent)
    meta_type = "MyContent"
    schema = MyContent

    def ma_fonction(self):
        ......
        return res
I want to modify the code of my function ma_fonction in another product. I have tried using an adapter and following the plone docs, but without success.
The class where I wish to customize the function:
class CustomClass(object):
    """ """
    implements(IMyContent)
    adapts(IMyContent)

    def at_post_payment_script(self, obj_transaction):
        """ """
        ......
        # My new code
        return res
The configure.zcml where I declared my adapter:
<adapter for="bsw.produit_1.content.mycontent.MyContent"
         provides="bsw.produit_1.interfaces.IMyContent"
         factory=".customclass.CustomClass" />
In my zcml declaration, I've also tried putting archetypes.schemaextender.interfaces.ISchemaExtender in provides, and putting the interface IMyContent in for instead of the class.
None of these worked; the customized code is never executed. Does anybody have a solution for this?
The solution you need depends on what you want to achieve, but archetypes.schemaextender is the wrong one. schemaextender is there to modify the schema, which includes:
fields order
field/widget attributes
schemata
setter/getter of a field
new fields
override fields
Implementing your own adapter is definitely the right approach.
First you need to implement an adapter for the default behavior.
Second, you need to adapt the context and the request. The request is important, since that's a way to define a more specific adapter if your other product is installed.
Python code for the default implementation (adapter.py):
from zope.component import adapts
from zope.interface import Interface
from zope.interface import implements


class IBehavior(Interface):

    def __init__(context, request):
        """Adapts context and request"""

    # ... more ...


class DefaultBehavior(object):
    implements(IBehavior)
    adapts(IMyContent, Interface)  # IMPORTANT: two discriminators

    def __init__(self, context, request):
        self.context = context
        self.request = request

    def __call__(self):
        # your default implementation goes here.
        pass
Register the adapter with zcml:
<adapter factory=".adapter.DefaultBehavior" />
You're now able to call the default adapter in ma_fonction:
from zope.component import getMultiAdapter


class MyContent(base.ATCTContent):

    def ma_fonction(self):
        adapter = getMultiAdapter((self, self.REQUEST), IBehavior)
        return adapter()
Now you can implement a more specific adapter in your other product using a browserlayer (check the documentation for how to register one).
In your other package, register an adapter which implements the same IBehavior interface but adapts your browserlayer:
from other.package.interfaces import IOtherPackageLayer
from zope.component import adapts
from zope.interface import implements


class DifferentBehavior(object):
    implements(IBehavior)
    adapts(IMyContent, IOtherPackageLayer)  # IMPORTANT: adapt the browserlayer, not Interface

    def __init__(self, context, request):
        self.context = context
        self.request = request

    def __call__(self):
        # your different implementation goes here.
        pass
Also register it with zcml:
<adapter factory=".adapter.DifferentBehavior" />
Your ma_fonction now calls the default adapter if the other package is not installed, and the different adapter if it is.
The simplest method you can use (although not politically correct!) is monkey-patching.
Take a look at collective.monkeypatcher; you simply need a configuration like this (in your third-party product):
<monkey:patch
    description=""
    class="your.package.MyContent"
    original="ma_fonction"
    replacement=".monkeys.new_ma_fonction"
    />
Then, in your package, also create a monkeys.py module with the new method inside:
def new_ma_fonction(self):
    # do stuff
    return res
Relating to Accessing grails application config from a quartz job:
Apparently, DI doesn't happen prior to the creation of a job. I'm guessing this is the same with other Grails artefacts (I couldn't spot relevant documentation).
In my particular case, I was aiming to load a property from config and expose it from the job class. In general, though, it seems a valid use case to me that artefacts load configuration and then return those properties via an API.
I'm wondering, then, how this could be achieved when a class cannot rely on access to grailsApplication.config at construction.
Thanks
Try with:
import org.codehaus.groovy.grails.commons.ConfigurationHolder as CH

class MyJob {
    def execute() {
        def myConfigVar = CH.flatConfig.get('my.var.setup.in.config.groovy')
        ...
    }
}
OR
import grails.util.Holders

class MyJob {
    def execute() {
        def myConfigVar = Holders.config.my.var.setup.in.config.groovy
        ...
    }
}