How to automatically save data from a URL to the database

Submitted by 爷,独闯天下 on 2019-12-13 23:02:26

Question


I'm trying to build a news feed. I have an Ember frontend and a Rails API backend, and they are talking to each other. I have the FeedsController, Feed model, and migration built in the backend, and I have data stored in my database and displaying correctly on my Ember frontend using Faker. Easy enough.

What I'm trying to do is load data (title, image, author, etc.) from NewsApi, by keyword, into my Rails app, save it to my database, and then display it on my frontend. How do I load and save this data automatically in my Rails app? I am looking for an automated solution: 1) load news stories by keyword, 2) save them to the database, 3) display them. I have limited experience with backend problems, but I can work my way through; I just don't see any tutorials covering this specific problem. If you'd like to see my code I can paste it in, but the files are essentially boilerplate with your basic index, create, destroy, etc. I understand the basics of Rails, I just can't seem to figure this out. Thanks in advance!

Edit - I've made some progress over the last day or so. I now have data coming from NewsApi and can display it at localhost:3000/feeds. My model:

require 'rest_client'

class Feed < ActiveRecord::Base
  NEWS_API_URL = "https://newsapi.org/v2/everything?q=marijuana&apiKey=#{ENV['NEWS_API_KEY']}"

  # Fetch the raw response body from NewsApi. For a GET request the
  # relevant header is Accept (what we want back), not Content-Type.
  def self.get_data
    RestClient.get(NEWS_API_URL, accept: :json)
  end

  # Parse the response body into a Hash:
  # { "status" => ..., "totalResults" => ..., "articles" => [...] }
  def self.retrieve_results
    JSON.parse(get_data)
  end
end
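For reference, `retrieve_results` returns a plain Ruby Hash with string keys, so the articles are reached by key lookup. A minimal sketch of walking that structure, using a hard-coded sample body in place of the live API call so nothing here touches the network (the sample is trimmed from the Data section below):

```ruby
require 'json'

# A trimmed-down stand-in for the body NewsApi returns.
body = <<~JSON
  {
    "status": "ok",
    "totalResults": 2,
    "articles": [
      {
        "author": "Lacey Smith",
        "title": "Cannabis may alter the genetic makeup of sperm",
        "url": "https://mashable.com/video/cannabis-sperm-dna/"
      }
    ]
  }
JSON

parsed = JSON.parse(body)

# JSON.parse yields string keys, not symbols.
puts parsed["status"]              # prints "ok"
parsed["articles"].each do |article|
  puts "#{article["title"]} - #{article["url"]}"
end
```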

And my controller:

class FeedsController < ApplicationController
  # Note: :create was removed from this callback; a POST to create has
  # no :id param, so Feed.find(params[:id]) would raise there.
  before_action :set_feed, only: [:show]

  def index
    @feeds = Feed.retrieve_results
    render json: @feeds
    # redirect_to action: 'create'  ??  Maybe?
  end

  def create
    # I believe I need code here
  end

  private

    def set_feed
      @feed = Feed.find(params[:id])
    end

    def feed_params
      params.require(:feed).permit(:name, :summary, :url, :published_at, :guid)
    end
end

Data:

{
  "status": "ok",
  "totalResults": 6988,
  "articles": [
    {
      "source": {
        "id": "mashable",
        "name": "Mashable"
      },
      "author": "Lacey Smith",
      "title": "Cannabis may alter the genetic makeup of sperm",
      "description": "A new study suggests that marijuana use can not only lower sperm count, but also affect sperm DNA. Read more... More about Marijuana, Cannabis, Sperm, Science, and Health",
      "url": "https://mashable.com/video/cannabis-sperm-dna/",
      "urlToImage": "https://i.amz.mshcdn.com/cdBWehMuVAb4DgU9flYo0lQTyT8=/1200x630/2019%2F01%2F16%2F18%2F1421dae3db754d0a8c4276e524b47f7f.23835.jpg",
      "publishedAt": "2019-01-16T20:13:12Z",
      "content": null
    },
    {
      "source": {
        "id": "the-new-york-times",
        "name": "The New York Times"
      },
      "author": "ALEX BERENSON",
      "title": "What Advocates of Legalizing Pot Don’t Want You to Know",
      "description": "The wave toward legalization ignores the serious health risks of marijuana.",
      "url": "https://www.nytimes.com/2019/01/04/opinion/marijuana-pot-health-risks-legalization.html",
      "urlToImage": "https://static01.nyt.com/images/2019/01/04/opinion/04berenson/04berenson-facebookJumbo.jpg",
      "publishedAt": "2019-01-05T02:14:00Z",
      "content": "Meanwhile, legalization advocates have squelched discussion of the serious mental health risks of marijuana and THC, the chemical responsible for the drugs psychoactive effects. As I have seen firsthand in writing a book about cannabis, anyone who raises thos… [+1428 chars]"
    },

Now I need to iterate through each article, pull out the correct info (source, author, title, description, etc.; basically all of it) and save it to my own database columns. How do I save this data automatically when I get a response from the URL? I also need a way to make certain I only save each story once, so that subsequent GETs and creates simply update the database as new stories come in. Again, thanks for any help!
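One wrinkle worth noting: the column names permitted by `feed_params` (`:name`, `:summary`, `:url`, `:published_at`, `:guid`) do not match the API's keys (`title`, `description`, `publishedAt`), so some mapping is needed before saving. Below is a minimal sketch of that mapping, with the save-once behavior handled by ActiveRecord's `find_or_create_by` shown in comments (it needs the Rails app to run). Using the article URL as the `guid` is an assumption; any stable unique value from the API would do:

```ruby
# Map one NewsApi article Hash onto the column names this app's
# feed_params permits. The guid choice (the article URL) is an
# assumption made for illustration.
def feed_attributes(article)
  {
    name:         article["title"],
    summary:      article["description"],
    url:          article["url"],
    published_at: article["publishedAt"],
    guid:         article["url"]
  }
end

# Inside the Rails app, the import loop could look something like this,
# with find_or_create_by skipping stories that were already saved:
#
#   Feed.retrieve_results["articles"].each do |article|
#     attrs = feed_attributes(article)
#     Feed.find_or_create_by(guid: attrs[:guid]) do |feed|
#       feed.assign_attributes(attrs)
#     end
#   end

# Standalone demonstration with one article from the sample data above:
sample = {
  "title"       => "What Advocates of Legalizing Pot Don’t Want You to Know",
  "description" => "The wave toward legalization ignores the serious health risks of marijuana.",
  "url"         => "https://www.nytimes.com/2019/01/04/opinion/marijuana-pot-health-risks-legalization.html",
  "publishedAt" => "2019-01-05T02:14:00Z"
}

p feed_attributes(sample)
```

To make the import run automatically rather than once per request, the loop above could live in a rake task fired by cron, or in a background job; that scheduling piece is separate from the mapping shown here.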


Answer 1:


I am not sure I understand the question. You need to use something like fetch: if you set it up correctly, it will push the newly created data to the Rails backend. Using POST is your autosave. You mentioned trying to load data, but why is GET not working? GET should load the data, and POST should post and save the data. I come from React and Redux, but you should be able to use something like axios to make setting this up easier. Let me know if I did not answer your question.



Source: https://stackoverflow.com/questions/54331987/how-to-automatically-save-data-from-url-to-database
